fix: SDK API correctness — constructor, cypher result, VectorResult fields #2
Conversation
- client: support "host:port" string in AsyncCoordinodeClient constructor
- test_types: fix _FakePV vector kind "vector" → "vector_value"
- test_basic: fix health() (bool not dict), cypher() (List[Dict] not .columns/.rows)
- langchain graph.py: query() — cypher() already returns List[Dict], no manual parse
- llama-index base.py: fix get/get_triplets/get_rel_map (dict row access), vector_query (r.node.id, r.node.properties, r.distance), structured_query (direct return)
- lint: remove unused imports and dead variable

Closes #1
📝 Walkthrough

Client constructors now accept and parse "host:port" (including bracketed IPv6); RPC result shapes changed to list-of-dicts and VectorResult fields; typing updated to PEP 604/built-in generics; Cypher generation/result parsing updated for LlamaIndex; tests, docs, CI, packaging, and proto handling adjusted accordingly.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

🚥 Pre-merge checks: ✅ 4 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@coordinode/_types.py`:
- Line 6: The file coordinode._types has a formatting mismatch around the import
statement (the line importing Any, Dict, List, Union) flagged by ruff; run the
formatter (e.g., `ruff format`) or apply the project's formatting rules to
reformat this file so the import line and file meet ruff's style checks, then
re-run `ruff format --check` to confirm the issue is resolved.
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 148-155: The rel_label extraction in get_rel_map() is wrong
because variable-length path [r*1..{depth}] returns relationship dicts, so
change get_rel_map() to either modify its Cypher to include type(r) AS rel_type
(like get_triplets() does) and read row["rel_type"], or keep the existing query
and extract the relationship type from the first relationship dict (e.g., if
rels is a list take rels[0].get("type", "RELATED") instead of str(rels[0]));
update the code that sets rels and rel_label to use the chosen approach (refer
to get_rel_map(), rels, rel_label, and get_triplets() for guidance).
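The second option the comment describes can be sketched as a small helper. The row shape (`{"r": [...]}`) and the `"type"` key are assumptions about the adapter's `cypher()` output, not a confirmed API:

```python
# Hedged sketch: derive a relationship label from a variable-length match.
# Each hop of [r*1..N] is assumed to come back as a relationship dict.

def rel_label_from_row(row: dict) -> str:
    """Read the first hop's relationship type from a get_rel_map() result row."""
    rels = row.get("r", [])
    if isinstance(rels, list) and rels and isinstance(rels[0], dict):
        # Take the first relationship dict's type, per the review suggestion.
        return rels[0].get("type", "RELATED")
    return "RELATED"  # fallback label suggested by the comment


print(rel_label_from_row({"r": [{"type": "KNOWS"}, {"type": "LIKES"}]}))  # KNOWS
print(rel_label_from_row({"r": []}))  # RELATED
```

This avoids the `str(rels[0])` bug: stringifying the whole dict would produce a label like `"{'type': 'KNOWS'}"` instead of `"KNOWS"`.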
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: badaedb1-af49-4cdf-a4e6-30c7a7042df7
📒 Files selected for processing (6)
- coordinode/_types.py
- coordinode/client.py
- langchain-coordinode/langchain_coordinode/graph.py
- llama-index-coordinode/llama_index/graph_stores/coordinode/base.py
- tests/integration/test_basic.py
- tests/unit/test_types.py
@_requires_proto on TestToPropertyValue — skips gracefully without make proto. TestFromPropertyValue (mock-based) always runs: 7 passed. Part of #1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
tests/unit/test_types.py (2)
113-116: ⚠️ Potential issue | 🟠 Major — Use `vector_value` in the vector assertion.

Line 115 still asserts `pv.vector.values`, but this suite now targets the `vector_value` oneof field. This can fail once proto stubs are present and `TestToPropertyValue` runs.

Suggested fix:

```diff
 def test_float_list_becomes_vector(self):
     pv = to_property_value([1.0, 2.0, 3.0])
-    assert list(pv.vector.values) == pytest.approx([1.0, 2.0, 3.0])
+    assert list(pv.vector_value.values) == pytest.approx([1.0, 2.0, 3.0])
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/unit/test_types.py` around lines 113 - 116, The test test_float_list_becomes_vector is asserting pv.vector.values but the proto now uses the oneof field vector_value; update the assertion to reference pv.vector_value (e.g., check pv.vector_value.values or pv.vector_value depending on expected shape) so the test uses the vector_value oneof returned by to_property_value; locate the assertion in test_float_list_becomes_vector and replace pv.vector.values with pv.vector_value.* as appropriate.
1-162: ⚠️ Potential issue | 🟡 Minor — CI blocker: file is not Ruff-formatted.

Pipeline reports `ruff format --check` would reformat this file; please run `ruff format` before merge.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/unit/test_types.py` around lines 1 - 162, The file fails the ruff formatter check; run the auto-formatter and commit the changes: run `ruff format` (or your project's pre-commit/IDE formatting command) on tests/unit/test_types.py to reformat the module including classes _FakePV, _FakeVec, _FakeList, _FakeMap and the test classes TestToPropertyValue and TestFromPropertyValue so the file passes `ruff format --check` in CI.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Outside diff comments:
In `@tests/unit/test_types.py`:
- Around line 113-116: The test test_float_list_becomes_vector is asserting
pv.vector.values but the proto now uses the oneof field vector_value; update the
assertion to reference pv.vector_value (e.g., check pv.vector_value.values or
pv.vector_value depending on expected shape) so the test uses the vector_value
oneof returned by to_property_value; locate the assertion in
test_float_list_becomes_vector and replace pv.vector.values with
pv.vector_value.* as appropriate.
- Around line 1-162: The file fails the ruff formatter check; run the
auto-formatter and commit the changes: run `ruff format` (or your project's
pre-commit/IDE formatting command) on tests/unit/test_types.py to reformat the
module including classes _FakePV, _FakeVec, _FakeList, _FakeMap and the test
classes TestToPropertyValue and TestFromPropertyValue so the file passes `ruff
format --check` in CI.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 46e07628-7bd6-4749-a08b-a5706939ceca
📒 Files selected for processing (1)
tests/unit/test_types.py
- Fix _proto/__init__.py: remove sys.path hack that shadowed google namespace package with generated _proto/google/api/
- Fix Makefile proto target: apply import fix to all *.py files (not just *_pb2_grpc.py); cypher_pb2.py also had absolute imports
- Fix tests/unit/test_types.py: correct field names (vector_value, map_value.entries); test_none → null semantics, not raises
- Fix tests/integration/test_basic.py: use RETURN n.name (id(n) not implemented in alpha); mark test_vector_search as xfail
- ci.yml: switch to uv, add Python 3.13 to matrix, remove 3.9/3.10 (both past EOL Oct 2025)
- pyproject.toml (all 3 packages): requires-python >=3.11, add 3.13 classifier, drop 3.9/3.10
- ruff.toml: exclude coordinode/_proto/ from linting (generated code)
- Auto-fix ruff UP violations: Union→X|Y, Dict/List→dict/list, Optional→X|None, isinstance tuples→X|Y union syntax

Tests: 21 passed, 1 xfailed (vector search — alpha server limitation)
VectorServiceImpl is a stub in server/src/services/vector.rs that always returns []. HNSW algorithm is implemented (coordinode-vector crate) but not wired to the RPC handler. Tracked as G007 in GAPS.md.
uv workspace:
- Root pyproject.toml with [tool.uv.workspace] — members: coordinode, langchain-coordinode, llama-index-coordinode
- uv.lock committed (89 packages, reproducible installs)
- hatch-vcs: fallback-version = "0.0.0" + raw-options root = ".." so git version detection works from subdirectory
- Makefile: `make install` now uses `uv sync`; `make install-pip` as pip fallback for non-uv contributors
- CI updated: `uv sync` + `uv run` everywhere, no `uv pip install`

Both pip and uv work without any wrapper:

```
pip install coordinode         # PyPI user
uv add coordinode              # uv user
pip install -e coordinode/     # contributor (pip)
uv sync                        # contributor (uv)
```

READMEs:
- Root README.md: badges, all 3 packages, uv+pip quickstart, dev setup (uv + pip), donation section with QR code
- coordinode/README.md: PyPI page — connection options, type mapping table, vector search, async client
- langchain-coordinode/README.md: PyPI page — GraphCypherQAChain, schema inspection, LLMGraphTransformer, add_graph_documents
- llama-index-coordinode/README.md: PyPI page — PropertyGraphIndex from docs + from existing, vector+keyword hybrid retrieval, manual upsert, capabilities table
- assets/usdt-qr.svg: USDT TRC-20 donation QR code (TFDsez...)
- Sub-package READMEs reference QR via absolute GitHub raw URL (required for PyPI markdown rendering)
- readme = "README.md" in each pyproject.toml (PyPI page)

python-version classifiers: 3.11, 3.12, 3.13 (3.9/3.10 EOL)
requires-python = ">=3.11" in all 3 packages
- Add release-please workflow + config + manifest (initial version 0.3.0)
- release.yml: remove TestPyPI job, publish all tags directly to PyPI
- coordinode/pyproject.toml: fix version-file path (_version.py, not coordinode/_version.py)
- .gitignore: cover entire _proto/ dir and fix _version.py paths for subdirs
@coderabbitai Re: outside-diff comments on
I'll resolve both comments now.
To summarize what was wrong:
Both are incorrect findings and can be safely disregarded. I'll make a note to be more careful about flagging outside-diff issues on files that haven't actually changed in the PR.
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
.github/workflows/ci.yml (1)
53-53: 🧹 Nitpick | 🔵 Trivial — Pin the integration service image to a fixed tag or digest.

Using `ghcr.io/structured-world/coordinode:latest` can cause flaky CI when upstream changes land. Prefer a version tag or immutable digest for reproducible runs.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/ci.yml at line 53, The workflow currently references a mutable image tag "image: ghcr.io/structured-world/coordinode:latest"; replace that with a fixed version tag or an immutable digest (e.g., "ghcr.io/structured-world/coordinode:<version>" or "ghcr.io/structured-world/coordinode@sha256:<digest>") to ensure reproducible CI runs—update the CI job definition that contains the `image:` key to use the chosen tag or digest and, if using a digest, obtain it from the registry (or a release) and paste the exact sha256 value.

tests/unit/test_types.py (1)
38-41: ⚠️ Potential issue | 🟡 Minor — `_FakeMap.fields` doesn't match the expected `.entries` accessor.

The `_FakeMap` class uses `.fields` but `from_property_value` (line 74 in `_types.py`) accesses `.map_value.entries`. While `_FakeMap` isn't currently used in the `TestFromPropertyValue` tests, this inconsistency could cause issues if map tests are added.

🐛 Suggested fix

```diff
 class _FakeMap:
     def __init__(self, fields):
-        self.fields = dict(fields)
+        self.entries = dict(fields)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/unit/test_types.py` around lines 38 - 41, The _FakeMap test helper exposes a .fields attribute but the code under test (from_property_value) expects a .map_value.entries accessor; update _FakeMap so it mimics the real protobuf structure by providing a .map_value object with an .entries mapping (or a .map_value property that returns an object with an .entries attribute) instead of or in addition to .fields so tests using from_property_value will find .map_value.entries as expected.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/release-please.yml:
- Around line 15-17: The workflow exposes outputs release_created and tag_name
via outputs: release_created: ${{ steps.rp.outputs.release_created }} and
tag_name: ${{ steps.rp.outputs.tag_name }} but no downstream job consumes them;
either remove these unused outputs from the outputs block or add/document the
intended consumer (e.g., a downstream job or an external workflow like
release.yml that triggers on tag push) so the outputs are actually used; update
the outputs block accordingly and, if keeping them, add a comment or README
describing how release.yml or other workflows should consume
steps.rp.outputs.release_created and steps.rp.outputs.tag_name.
In @.github/workflows/release.yml:
- Line 98: Remove the extra trailing blank line at the end of release.yml that
triggers YAML lint's empty-lines error; open .github/workflows/release.yml (the
release workflow file) and delete the final blank line so the file ends with the
last YAML content line (optionally followed by a single newline character) and
then save.
In `@coordinode/client.py`:
- Around line 104-107: The current host:port parsing in coordinode/client.py
using host.rsplit(":", 1) will break IPv6 addresses; update the block that
handles the "host:port" single-string (the host and port variables) to first
detect bracketed IPv6 form (e.g., "[::1]:7080") and extract the address between
'[' and ']' and the trailing port, otherwise only split on the last colon for
non-bracketed hosts but ensure you skip raw IPv6 without brackets (e.g.,
"2001:db8::1") by requiring bracket notation or rejecting/ignoring a port when
host contains multiple colons; implement this logic in the same place where host
and port are reassigned so existing callers (host, port) remain unchanged and
add a short comment documenting that unbracketed IPv6 must be passed as separate
host/port or use bracket notation.
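The bracket-aware parsing described above can be sketched as a standalone function. The function name, the default port of 7080 (taken from the `[::1]:7080` example), and the rule that unbracketed multi-colon strings are treated as port-less IPv6 addresses are all assumptions for illustration, not the SDK's actual API:

```python
# Hedged sketch of "host:port" parsing that tolerates IPv6.
# Unbracketed IPv6 (e.g. "2001:db8::1") is ambiguous, so it is passed
# through unchanged; bracket notation is required to attach a port.

def split_host_port(address: str, default_port: int = 7080) -> tuple[str, int]:
    if address.startswith("["):
        # Bracketed IPv6: "[::1]:7080" or bare "[::1]".
        host, _, rest = address[1:].partition("]")
        if rest.startswith(":"):
            return host, int(rest[1:])
        return host, default_port
    if address.count(":") > 1:
        # Raw IPv6 without brackets — keep whole string as the host.
        return address, default_port
    if ":" in address:
        host, port = address.rsplit(":", 1)
        return host, int(port)
    return address, default_port


print(split_host_port("[::1]:9000"))      # ('::1', 9000)
print(split_host_port("2001:db8::1"))     # ('2001:db8::1', 7080)
print(split_host_port("localhost:7080"))  # ('localhost', 7080)
```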
In `@coordinode/pyproject.toml`:
- Around line 49-50: The configured VCS hook generates _version.py but the
package still hardcodes __version__ = "0.3.0a1"; update the package to use the
generated version by changing the __version__ assignment in
coordinode/__init__.py to import __version__ from _version.py with a safe
ImportError fallback, or remove the version-file = "_version.py" hatch config if
you intend to keep manual versioning; specifically modify the module-level
__version__ definition to attempt "from ._version import __version__" (fallback
to a sensible default) so the generated file is actually used.
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 62-64: In the get() method, ids is typed as list[str] but each
node_id is passed directly to self._client.get_node(node_id) which expects an
int; convert each node_id to int before calling get_node (e.g., call
self._client.get_node(int(node_id))) and ensure any downstream handling uses the
integer result accordingly.
In `@Makefile`:
- Line 1: The .PHONY declaration is out of sync: it lists install-uv but the
actual Makefile target is install-pip (and checkmake suggests adding an all
target). Update the .PHONY line to include install-pip instead of install-uv and
(optionally) add all to the .PHONY list so it matches the defined targets;
ensure the identifiers mentioned (.PHONY, install-uv, install-pip, all) map
exactly to the targets present elsewhere in the file.
- Around line 19-20: The sed invocation in the Makefile is using BSD-style
in-place editing (`sed -i ''`) which breaks on GNU sed; update the command that
targets generated Python files in $(PROTO_OUT) (the line with "sed -i '' 's/from
coordinode\./from coordinode._proto.coordinode./g'") to use a portable two-step
approach: run sed with a safe backup suffix (e.g. `-i.bak`) and then remove the
generated .bak files after the substitution to avoid leaving artifacts; ensure
the replacement string and target file selection remain identical so the
substitution and subsequent cleanup operate on the same files.
In `@tests/integration/test_basic.py`:
- Line 52: The compound assertion using found and found[0]["name"] should be
split so failures indicate which condition failed: first assert that the search
results exist (e.g., assert found or assert len(found) > 0 with a clear
message), then assert the first item's name equals "sdk-test-node" (assert
found[0]["name"] == "sdk-test-node"), referencing the found variable and the
index access found[0]["name"] to locate the check.
---
Outside diff comments:
In @.github/workflows/ci.yml:
- Line 53: The workflow currently references a mutable image tag "image:
ghcr.io/structured-world/coordinode:latest"; replace that with a fixed version
tag or an immutable digest (e.g.,
"ghcr.io/structured-world/coordinode:<version>" or
"ghcr.io/structured-world/coordinode@sha256:<digest>") to ensure reproducible CI
runs—update the CI job definition that contains the `image:` key to use the
chosen tag or digest and, if using a digest, obtain it from the registry (or a
release) and paste the exact sha256 value.
In `@tests/unit/test_types.py`:
- Around line 38-41: The _FakeMap test helper exposes a .fields attribute but
the code under test (from_property_value) expects a .map_value.entries accessor;
update _FakeMap so it mimics the real protobuf structure by providing a
.map_value object with an .entries mapping (or a .map_value property that
returns an object with an .entries attribute) instead of or in addition to
.fields so tests using from_property_value will find .map_value.entries as
expected.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 6a174617-f7ca-43af-a3b8-a064fe0350df
⛔ Files ignored due to path filters (2)
- assets/usdt-qr.svg is excluded by !**/*.svg
- uv.lock is excluded by !**/*.lock
📒 Files selected for processing (23)
- .github/workflows/ci.yml
- .github/workflows/release-please.yml
- .github/workflows/release.yml
- .gitignore
- .release-please-manifest.json
- Makefile
- README.md
- coordinode/README.md
- coordinode/_proto/__init__.py
- coordinode/_types.py
- coordinode/client.py
- coordinode/pyproject.toml
- langchain-coordinode/README.md
- langchain-coordinode/langchain_coordinode/graph.py
- langchain-coordinode/pyproject.toml
- llama-index-coordinode/README.md
- llama-index-coordinode/llama_index/graph_stores/coordinode/base.py
- llama-index-coordinode/pyproject.toml
- pyproject.toml
- release-please-config.json
- ruff.toml
- tests/integration/test_basic.py
- tests/unit/test_types.py
💤 Files with no reviewable changes (1)
- coordinode/_proto/__init__.py
Pull request overview
This PR aligns the Python SDK + LangChain/LlamaIndex integrations and tests with the actual gRPC API behavior (constructor address parsing, health() return type, cypher() row shape, VectorResult fields), and modernizes repo tooling (uv workspace, ruff config, release-please automation, refreshed READMEs/CI).
Changes:
- Fix SDK surface/API expectations across core client, integrations, and tests (health(), cypher(), VectorResult, host:port handling).
- Add/refresh developer tooling and automation (uv workspace root, ruff config, release-please, updated CI/release workflows).
- Expand and split documentation into per-package READMEs and improve top-level README.
Reviewed changes
Copilot reviewed 22 out of 25 changed files in this pull request and generated 4 comments.
Show a summary per file
| File | Description |
|---|---|
| tests/unit/test_types.py | Updates PropertyValue conversion tests, adds proto-stub detection + skipping, aligns field names. |
| tests/integration/test_basic.py | Updates integration tests to new health() and cypher() return shapes; xfails vector search until server wiring exists. |
| ruff.toml | Adds ruff configuration and excludes generated proto stubs. |
| release-please-config.json | Adds release-please config for Python releases. |
| README.md | Refreshes top-level docs (badges, examples, dev setup, integration test instructions, support section). |
| pyproject.toml | Adds uv workspace root + shared dev dependency groups. |
| Makefile | Updates proto generation/install targets and switches to uv-based install path. |
| llama-index-coordinode/README.md | Adds package-specific README with usage examples and capabilities. |
| llama-index-coordinode/pyproject.toml | Updates Python requirement to 3.11+, readme location, hatch-vcs options/classifiers. |
| llama-index-coordinode/llama_index/graph_stores/coordinode/base.py | Fixes integration logic to match cypher() dict rows and VectorResult fields; updates typing. |
| langchain-coordinode/README.md | Adds package-specific README with usage examples and API notes. |
| langchain-coordinode/pyproject.toml | Updates Python requirement to 3.11+, readme location, hatch-vcs options/classifiers. |
| langchain-coordinode/langchain_coordinode/graph.py | Updates query() to return cypher()’s dict rows directly; typing cleanups. |
| coordinode/README.md | Adds package-specific README for the core SDK with updated examples and API notes. |
| coordinode/pyproject.toml | Updates Python requirement to 3.11+, hatch-vcs config/version file path, classifiers. |
| coordinode/client.py | Implements host:port parsing, updates return types, aligns VectorResult fields, typing cleanup. |
| coordinode/_types.py | Updates type aliases and PropertyValue conversion logic (but contains a runtime isinstance issue). |
| coordinode/_proto/__init__.py | Removes sys.path mutation logic for generated stubs. |
| assets/usdt-qr.svg | Adds donation QR asset referenced by READMEs. |
| .release-please-manifest.json | Adds release-please manifest with initial version. |
| .gitignore | Adjusts ignores for generated proto stubs and hatch-vcs version files; adds GAPS.md ignore. |
| .github/workflows/release.yml | Updates release pipeline to use release-please + uv and simplifies publishing flow. |
| .github/workflows/release-please.yml | Adds release-please workflow definition. |
| .github/workflows/ci.yml | Switches CI to uv, updates Python matrix to 3.11–3.13, runs proto generation/tests via uv. |
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
llama-index-coordinode/llama_index/graph_stores/coordinode/base.py (2)
80-105: ⚠️ Potential issue | 🟠 Major — These public filter arguments still silently no-op.

`get_triplets()` never consumes `properties`/`ids`, and `delete()` never consumes `relation_names`/`properties`. Callers supplying those arguments currently get a successful call with the wrong result set. Please either implement the predicates or fail fast with `NotImplementedError` for unsupported combinations.

Also applies to: 188-201
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py` around lines 80 - 105, get_triplets currently ignores the properties and ids parameters and delete ignores relation_names and properties, causing silent no-ops; update get_triplets (function get_triplets) to either apply filters for properties and ids into the Cypher WHERE clause (e.g., add predicates against node properties and id(n)/id(m) using params) or explicitly raise NotImplementedError when those args are provided, and similarly update delete (method delete) to either incorporate relation_names and properties into the deletion MATCH/WHERE pattern or raise NotImplementedError if those combinations are unsupported; ensure you reference and populate the params dict and adjust rel_pattern/where construction consistently so callers get a correct filtered result or a clear failure.
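The "fail fast" alternative is a short guard. The standalone signature here mirrors the review comment for illustration; the real method takes `self` and different defaults may apply:

```python
# Minimal sketch: reject unsupported filter arguments loudly instead of
# silently returning an unfiltered result set.
def get_triplets(entity_names=None, relation_names=None,
                 properties=None, ids=None):
    if properties is not None or ids is not None:
        raise NotImplementedError(
            "get_triplets() does not support properties/ids filters yet"
        )
    return []  # placeholder for the real Cypher-backed lookup


try:
    get_triplets(properties={"since": 2024})
except NotImplementedError as exc:
    print(f"rejected: {exc}")
```

Failing fast turns a silent correctness bug into an immediate, debuggable error for callers who rely on the unimplemented filters.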
123-156: ⚠️ Potential issue | 🟠 Major — `ignore_rels` is accepted but never applied.

`get_rel_map(..., ignore_rels=[...])` still returns paths containing those relationship types because the parameter only appears in a comment. That breaks callers relying on the exclusion filter. Filter `r` in Cypher (`ALL(rel IN r WHERE type(rel) NOT IN $ignore_rels)`) or skip matching rows before building the `Relation`.

Example Python-side guard:

```diff
 result = self._client.cypher(cypher, params={"ids": ids})
+ignored = set(ignore_rels or [])
 triplets: list[list[LabelledNode]] = []
 for row in result:
+    rels = row.get("r", [])
+    if ignored and isinstance(rels, list):
+        if any(isinstance(rel, dict) and rel.get("type") in ignored for rel in rels):
+            continue
     src_data = row.get("n", {})
     dst_data = row.get("m", {})
     src_id = str(row.get("_src_id", ""))
     dst_id = str(row.get("_dst_id", ""))
-    # Variable-length path [r*1..N] returns a list of relationship dicts.
-    rels = row.get("r", [])
     if isinstance(rels, list) and rels:
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py` around lines 123 - 156, The ignore_rels parameter of get_rel_map is never used; update the cypher built in get_rel_map (function get_rel_map in base.py) to include a WHERE clause filtering variable-length relationships, e.g. add "AND ALL(rel IN r WHERE type(rel) NOT IN $ignore_rels)" when ignore_rels is provided and pass ignore_rels in params, or alternatively after fetching result rows filter out any row whose rels (row["r"]) contains a relationship whose type is in ignore_rels before constructing LabelledNode triplets; ensure you reference ids and _src_id/_dst_id logic unchanged and only apply the ignore_rels check to the rels list.
♻️ Duplicate comments (1)
llama-index-coordinode/llama_index/graph_stores/coordinode/base.py (1)
62-64: ⚠️ Potential issue | 🔴 Critical — `get(ids=...)` still assumes callers already have graph-internal integer IDs.

`ids` is declared as `list[str]`, but Line 64 forwards each value straight into `get_node()`. This is the same blocker from the earlier review: the client API expects the integer graph ID, so this branch is still brittle and will mis-handle the string IDs exposed by the adapter. Convert here or switch the lookup to the persisted `n.id` property instead.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py` around lines 62 - 64, The get(ids=...) branch in the get method incorrectly forwards string IDs to self._client.get_node; convert each incoming string adapter ID to the internal integer graph ID before calling get_node (or resolve via the persisted node identity like n.id) so the client receives the expected integer ID; update the loop that iterates over ids to perform the lookup/translation (e.g., map adapter ID -> internal ID) and then call self._client.get_node with that integer.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 68-70: The Cypher is vulnerable because property keys, node
labels, and relationship types are interpolated directly; create a small helper
(e.g., escape_cypher_identifier(name)) that validates and safely
backtick-escapes identifiers (reject or sanitize names containing backticks,
spaces, colons, or other illegal chars), then use that helper when building
where_clauses (wrapping each property key as `n.{escaped_key}`) and when
composing any label/relationship/type patterns in get_triplets() (apply to
relation_names and any label interpolations) so only sanitized, quoted
identifiers are injected while keeping property values as parameters.
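A helper along those lines can be sketched in a few lines. The name `escape_cypher_identifier`, the allowed-character pattern, and the choice to reject embedded backticks (real Cypher escapes them by doubling, but rejection is the simpler safe option the comment permits) are illustrative assumptions:

```python
import re

# Hypothetical identifier sanitizer: pass plain identifiers through,
# backtick-quote anything else, reject names containing a backtick.
_PLAIN_IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")


def escape_cypher_identifier(name: str) -> str:
    if "`" in name:
        raise ValueError(f"illegal identifier: {name!r}")
    if _PLAIN_IDENT.match(name):
        return name
    return f"`{name}`"


# A WHERE-clause fragment would then be built from the escaped key,
# while the value stays a query parameter:
key = escape_cypher_identifier("first name")
print(f"n.{key} = $value")  # n.`first name` = $value
```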
---
Outside diff comments:
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 80-105: get_triplets currently ignores the properties and ids
parameters and delete ignores relation_names and properties, causing silent
no-ops; update get_triplets (function get_triplets) to either apply filters for
properties and ids into the Cypher WHERE clause (e.g., add predicates against
node properties and id(n)/id(m) using params) or explicitly raise
NotImplementedError when those args are provided, and similarly update delete
(method delete) to either incorporate relation_names and properties into the
deletion MATCH/WHERE pattern or raise NotImplementedError if those combinations
are unsupported; ensure you reference and populate the params dict and adjust
rel_pattern/where construction consistently so callers get a correct filtered
result or a clear failure.
- Around line 123-156: The ignore_rels parameter of get_rel_map is never used;
update the cypher built in get_rel_map (function get_rel_map in base.py) to
include a WHERE clause filtering variable-length relationships, e.g. add "AND
ALL(rel IN r WHERE type(rel) NOT IN $ignore_rels)" when ignore_rels is provided
and pass ignore_rels in params, or alternatively after fetching result rows
filter out any row whose rels (row["r"]) contains a relationship whose type is
in ignore_rels before constructing LabelledNode triplets; ensure you reference
ids and _src_id/_dst_id logic unchanged and only apply the ignore_rels check to
the rels list.
---
Duplicate comments:
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 62-64: The get(ids=...) branch in the get method incorrectly
forwards string IDs to self._client.get_node; convert each incoming string
adapter ID to the internal integer graph ID before calling get_node (or resolve
via the persisted node identity like n.id) so the client receives the expected
integer ID; update the loop that iterates over ids to perform the
lookup/translation (e.g., map adapter ID -> internal ID) and then call
self._client.get_node with that integer.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 467f8d62-23c3-4fc2-878a-5e5a6159ffad
📒 Files selected for processing (1)
llama-index-coordinode/llama_index/graph_stores/coordinode/base.py
…ifiers - add _cypher_ident(): backtick-escapes labels, rel-types, property keys - get(): lookup nodes by n.id (string) instead of get_node() (expects int) - get(): escape property keys in WHERE clause - get_triplets(): escape relation_names; raise NotImplementedError for properties/ids - upsert_relations(): escape rel.label - get_rel_map(): implement ignore_rels Python-side filter - delete(): raise NotImplementedError for relation_names/properties fix(client): parse host:port regardless of default port value fix(build): portable sed -i.bak instead of macOS-only sed -i '' fix(build): sync .PHONY — install-uv to install-pip fix(release): remove unused outputs from release-please.yml fix(release): remove trailing blank line in release.yml feat(coordinode): import __version__ from hatch-vcs _version.py test(integration): split compound assert for clearer failure messages
@coderabbitai Re: outside-diff findings on base.py:
Appreciate the thorough follow-through on those findings!
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
llama-index-coordinode/llama_index/graph_stores/coordinode/base.py (1)
215-222: ⚠️ Potential issue | 🟡 Minor
Inconsistent ID semantics between `get()` and `delete()`.
`get()` queries by the `n.id` property (string adapter ID), but `delete()` uses `id(n)` (internal graph integer ID). If callers pass the same `ids` list to both methods, `delete()` will target different nodes. Consider aligning `delete()` to also use `n.id IN $ids`:
♻️ Suggested alignment
```diff
 if ids:
-    cypher = "MATCH (n) WHERE id(n) IN $ids DETACH DELETE n"
+    cypher = "MATCH (n) WHERE n.id IN $ids DETACH DELETE n"
     self._client.cypher(cypher, params={"ids": ids})
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py` around lines 215 - 222, The delete() implementation in base.py uses the Neo4j internal integer id(n) but get() filters by the node property n.id (string adapter ID), causing mismatch; update the delete() method (the branch that handles ids) to filter on the node property (n.id IN $ids) instead of id(n) and pass the same params key (ids) so callers can use identical id lists for get() and delete().
coordinode/client.py (1)
289-299: 🧹 Nitpick | 🔵 Trivial
`health()` swallows all gRPC errors; consider logging or distinguishing transient failures.
Returning `False` for any `grpc.RpcError` is reasonable for a simple health check, but transient network issues become indistinguishable from a genuinely unhealthy server. Consider adding debug-level logging or documenting that `False` means "unreachable or unhealthy."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@coordinode/client.py` around lines 289 - 299, The health() method currently swallows all grpc.RpcError and returns False; change it to catch grpc.RpcError as e and emit a debug-level log including the exception, its gRPC status code (e.code()) and details (e.details()) to distinguish transient network errors from a true unhealthy service, e.g., inside health() around the _health_stub.Check call log with logger.debug("health check failed", exc_info=e, status=e.code(), details=e.details()); if no module logger exists, create one with logging.getLogger(__name__).
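The debug-logging suggestion can be sketched generically; `grpc.RpcError` (with `e.code()`/`e.details()`) is replaced by a plain `Exception` and a fake check so the snippet is self-contained:

```python
import logging

# Minimal sketch of the debug-logging suggestion above. A real
# implementation would catch grpc.RpcError and log e.code()/e.details();
# the generic Exception and fake check here are stand-ins.
logger = logging.getLogger(__name__)

def health(check) -> bool:
    try:
        check()
        return True
    except Exception as e:
        logger.debug("health check failed: %s", e)
        return False

def failing_check() -> None:
    raise RuntimeError("UNAVAILABLE: connection refused")

print(health(lambda: None))   # True
print(health(failing_check))  # False
```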
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/release-please.yml:
- Around line 16-20: The release config only targets the root package; update
release-please-config.json by adding three package entries under the "packages"
object for coordinode, langchain-coordinode, and llama-index-coordinode (each
pointing to the correct package directory and including the desired
release-type/rules as the root entry uses) so release-please will create
separate release PRs/tags per workspace package; if you intentionally want a
single shared root version instead, add a short comment in
release-please-config.json (or the repo README) stating that design decision to
avoid confusion.
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 182-188: The upsert_nodes method interpolates label directly into
the Cypher string causing a potential injection vulnerability; change
upsert_nodes to escape the node label using the existing _cypher_ident helper
(call _cypher_ident(_node_label(node)) or equivalent) when building the cypher
variable and keep params for id and props as before so labels are safely
quoted/escaped while preserving the MERGE/SET logic in upsert_nodes.
---
Outside diff comments:
In `@coordinode/client.py`:
- Around line 289-299: The health() method currently swallows all grpc.RpcError
and returns False; change it to catch grpc.RpcError as e and emit a debug-level
log including the exception, its gRPC status code (e.code()) and details
(e.details()) to distinguish transient network errors from a true unhealthy
service, e.g., inside health() around the _health_stub.Check call log with
logger.debug("health check failed", exc_info=e, status=e.code(),
details=e.details()); if no module logger exists, create one with
logging.getLogger(__name__).
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 215-222: The delete() implementation in base.py uses the Neo4j
internal integer id(n) but get() filters by the node property n.id (string
adapter ID), causing mismatch; update the delete() method (the branch that
handles ids) to filter on the node property (n.id IN $ids) instead of id(n) and
pass the same params key (ids) so callers can use identical id lists for get()
and delete().
fix(llama-index): align delete() to use n.id (string) not id(n) (int graph ID)
fix(client): add debug logging on gRPC health check failure
docs(release): document single shared version intent in release-please-config.json
@coderabbitai Re: outside-diff findings in review:
♻️ Duplicate comments (1)
llama-index-coordinode/llama_index/graph_stores/coordinode/base.py (1)
93-104: ⚠️ Potential issue | 🟡 Minor
Parameter name collision risk remains unaddressed.
The past review correctly identified that if `properties` contains keys that sanitize to the same name (e.g., `{"a-b": 1, "a_b": 2}`), the `param_map` dict will silently drop one value. This bug wasn't fixed.
🛡️ Suggested fix using indexed parameters
```diff
 elif properties:
-    # Build param mapping: safe Cypher param name → original value.
-    # Property keys may contain hyphens or other chars invalid in
-    # Cypher parameter names; _cypher_param_name() sanitises them.
-    param_map = {_cypher_param_name(k): v for k, v in properties.items()}
-    where_clauses = " AND ".join(f"n.{_cypher_ident(k)} = ${_cypher_param_name(k)}" for k in properties)
+    # Use indexed parameter names to avoid collisions when different
+    # property keys sanitize to the same Cypher parameter name.
+    param_map: dict[str, Any] = {}
+    clauses: list[str] = []
+    for idx, (k, v) in enumerate(properties.items()):
+        pname = f"p{idx}"
+        clauses.append(f"n.{_cypher_ident(k)} = ${pname}")
+        param_map[pname] = v
+    where_clauses = " AND ".join(clauses)
     cypher = f"MATCH (n) WHERE {where_clauses} RETURN n, n.id AS _nid LIMIT 1000"
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py` around lines 93 - 104, The code building param_map and where_clauses can collide when different property keys sanitize to the same _cypher_param_name (e.g., "a-b" and "a_b"); change the logic in the properties handling so you generate unique param names per original key (for example by appending an index or counter to _cypher_param_name(k)), build param_map with those unique names mapped to the corresponding values, and use those exact unique names in where_clauses (constructed alongside the param naming) so the call to self._client.cypher(cypher, params=param_map) receives all values; update references in this block (properties, param_map, _cypher_param_name, where_clauses, _cypher_ident, cypher, self._client.cypher, _node_result_to_labelled) accordingly.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@llama-index-coordinode/llama_index/graph_stores/coordinode/base.py`:
- Around line 93-104: The code building param_map and where_clauses can collide
when different property keys sanitize to the same _cypher_param_name (e.g.,
"a-b" and "a_b"); change the logic in the properties handling so you generate
unique param names per original key (for example by appending an index or
counter to _cypher_param_name(k)), build param_map with those unique names
mapped to the corresponding values, and use those exact unique names in
where_clauses (constructed alongside the param naming) so the call to
self._client.cypher(cypher, params=param_map) receives all values; update
references in this block (properties, param_map, _cypher_param_name,
where_clauses, _cypher_ident, cypher, self._client.cypher,
_node_result_to_labelled) accordingly.
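A standalone demonstration of the collision and of the indexed-name fix; `_cypher_param_name` here is a stand-in for the SDK helper of the same name:

```python
import re

# Standalone illustration of the collision the review flags: two property
# keys that sanitize to the same Cypher parameter name silently drop a
# value, while indexed names (p0, p1, ...) keep both. _cypher_param_name
# is a stand-in for the SDK helper of the same name.
def _cypher_param_name(key: str) -> str:
    return re.sub(r"\W", "_", key)

props = {"a-b": 1, "a_b": 2}

collided = {_cypher_param_name(k): v for k, v in props.items()}
indexed = {f"p{i}": v for i, v in enumerate(props.values())}

print(len(collided))  # 1 -- one value silently dropped
print(len(indexed))   # 2 -- both values survive
```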
VectorServiceImpl now executes Cypher with vector_distance() ORDER BY LIMIT instead of returning an empty stub response. Remove xfail marker.
GraphService now persists nodes and edges via Cypher. Remove xfail markers and stale comments about stub behaviour.
- test_hybrid_search_returns_results: creates two nodes connected by RELATED edge with embeddings, calls hybrid_search from start node, asserts results with distance and node fields
- test_get_schema_text: now creates actual data (label + edge type) and verifies SchemaTestLabel and SCHEMA_EDGE appear in schema text output
- Remove stale "graph RPC stubs not yet wired" comment in async test

25/25 integration tests pass
…, and sed idempotency
- base.py: use indexed params (p0, p1, …) in get() to prevent collision when different property keys sanitize to the same name
- client.py: raise ValueError when explicit port conflicts with port embedded in host string
- Makefile: add PYTHON ?= python3; use $(PYTHON) -m grpc_tools.protoc
- Makefile: tighten sed to match only v1. prefix so make proto is idempotent (repeated runs no longer double-rewrite imports)
- _types.py: use X | Y union syntax in isinstance() (UP038)
- test_sdk.py: drop bare f-string prefix (F541)
@coderabbitai re-review
✅ Actions performed: full review triggered.
Pull request overview
Copilot reviewed 24 out of 27 changed files in this pull request and generated 3 comments.
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
tests/unit/test_types.py (1)
38-41: ⚠️ Potential issue | 🟡 Minor
`_FakeMap.fields` should be `_FakeMap.entries` to match the proto accessor.
The production code in `coordinode/_types.py` (context snippet 2, line 79) uses `pv.map_value.entries.items()`, but `_FakeMap` exposes `.fields`. If any test uses `_FakeMap` with `from_property_value`, it will fail due to the attribute mismatch.
🐛 Proposed fix
```diff
 class _FakeMap:
     def __init__(self, fields):
-        self.fields = dict(fields)
+        self.entries = dict(fields)
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/unit/test_types.py` around lines 38 - 41, The test helper class _FakeMap exposes a .fields attribute but production code expects proto-style .entries (see pv.map_value.entries.items() and from_property_value usage); change _FakeMap to expose .entries (likely a dict or mapping) instead of .fields so tests calling from_property_value or code that accesses pv.map_value.entries will find the correct attribute; update any constructor or references in the test to populate _FakeMap.entries with the same dict that was previously assigned to .fields.
coordinode/client.py (1)
331-364: ⚠️ Potential issue | 🔴 Critical
Make the sync wrapper manage its own lifecycle.
`CoordinodeClient` still only initializes stubs in `__enter__`, and it still has no public `close()`. The new `CoordinodeGraph` and `CoordinodePropertyGraphStore` instantiate this class directly and later call `close()`, so their first RPC will hit uninitialized stubs and their teardown path will raise `AttributeError`. Add lazy `connect()` in `_run()` and a real `close()` that `__exit__` reuses.
Proposed fix
```diff
 class CoordinodeClient:
     def __init__(
         self,
         host: str = "localhost",
         port: int = 7080,
@@
     ) -> None:
         self._async = AsyncCoordinodeClient(host, port, tls=tls, timeout=timeout)
         self._loop = asyncio.new_event_loop()
+        self._connected = False

     def __enter__(self) -> CoordinodeClient:
-        self._loop.run_until_complete(self._async.connect())
+        if not self._connected:
+            self._loop.run_until_complete(self._async.connect())
+            self._connected = True
         return self

     def __exit__(self, *_: Any) -> None:
-        self._loop.run_until_complete(self._async.close())
-        self._loop.close()
+        self.close()

     def _run(self, coro: Any) -> Any:
+        if not self._connected:
+            self._loop.run_until_complete(self._async.connect())
+            self._connected = True
         return self._loop.run_until_complete(coro)
+
+    def close(self) -> None:
+        if self._connected:
+            self._loop.run_until_complete(self._async.close())
+            self._connected = False
+        if not self._loop.is_closed():
+            self._loop.close()
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@coordinode/client.py` around lines 331 - 364, CoordinodeClient must manage its own lifecycle: make _run ensure the async client is connected before awaiting (call self._async.connect() via the event loop if not already connected) so first RPCs don't hit uninitialized stubs; add a public close(self) that runs self._async.close() on the event loop and then closes the loop (idempotent/safe if already closed); update __exit__ to call this new close() and keep __enter__ to call connect as it already does. Ensure checks use the AsyncCoordinodeClient instance/state (self._async) to determine connection status and avoid re-creating loops or raising AttributeError.
langchain-coordinode/langchain_coordinode/graph.py (1)
105-167: ⚠️ Potential issue | 🟠 Major
`structured_schema` no longer matches the schema text the client emits.
`refresh_schema()` feeds this parser the string from `CoordinodeClient.get_schema_text()`, but that method now emits `Node labels:` / `Edge types:` bullet lines with inline `(properties: ...)`. This parser only recognizes a `relationship types` header plus separate `prop: TYPE` / `A -> B` lines, so node properties, relationship properties, and `relationships` all stay empty after every refresh. Build `structured_schema` from the schema RPC responses directly, or update the text format and parser together.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@langchain-coordinode/langchain_coordinode/graph.py` around lines 105 - 167, The parser _parse_schema no longer matches the text emitted by CoordinodeClient.get_schema_text (which now uses "Node labels:" and "Edge types:" with inline "(properties: ...)" and different bullets), causing structured_schema populated by refresh_schema to be empty; update _parse_schema to recognize the new headers ("Node labels:" and "Edge types:"), parse inline property lists in bullet lines (e.g., extract label/edge name and parse "(properties: prop1: TYPE, prop2: TYPE)" into node_props and rel_props), and still collect relationships from inline "A -> B" fragments, or else change refresh_schema to build structured_schema from the schema RPC response directly; locate and update _parse_schema (and refresh_schema/CoordinodeClient.get_schema_text usage) to ensure node_props, rel_props, and relationships are correctly populated from the new text format.
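An illustrative parser for the schema-text shape this comment describes ("Node labels:" / "Edge types:" headers with inline "(properties: ...)"); the exact format `get_schema_text()` emits may differ:

```python
# Illustrative-only parser for the schema text format the review
# describes; the exact output of get_schema_text() may differ.
def parse_schema_text(text: str) -> dict:
    node_props, edge_types = {}, []
    section = None
    for line in text.splitlines():
        line = line.strip()
        if line == "Node labels:":
            section = "nodes"
        elif line == "Edge types:":
            section = "edges"
        elif line.startswith("- "):
            # Split "Name (properties: a: TYPE, b: TYPE)" into name + props.
            name, _, rest = line[2:].partition(" (properties: ")
            props = rest.rstrip(")").split(", ") if rest else []
            if section == "nodes":
                node_props[name] = props
            elif section == "edges":
                edge_types.append(name)
    return {"node_props": node_props, "edge_types": edge_types}

sample = "Node labels:\n- Person (properties: name: STRING)\nEdge types:\n- KNOWS"
print(parse_schema_text(sample))
```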
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@coordinode/client.py`:
- Around line 114-126: The constructor currently can't distinguish an omitted
port from an explicit 7080; change the port parameter default in
AsyncCoordinodeClient.__init__ and CoordinodeClient.__init__ to use None (port:
Optional[int] = None) and update the host parsing logic: use _HOST_PORT_RE to
extract parsed_port, and if parsed_port exists then if port is None set port =
parsed_port else if port != parsed_port raise ValueError("Conflicting
ports..."); after parsing, if port is still None set port = 7080 as the final
default; adjust any type hints/usages of port accordingly (references:
AsyncCoordinodeClient.__init__, CoordinodeClient.__init__, host, port,
_HOST_PORT_RE, parsed_port).
In `@tests/integration/test_sdk.py`:
- Around line 91-120: These integration tests (e.g.,
test_create_node_rpc_returns_id, test_create_node_rpc_persists,
test_get_node_rpc, test_create_edge_rpc and the other ranges noted) assume
backend graph/RPC features that may be unavailable; add a capability probe or
conditional skip/xfail around these tests: implement a helper (or reuse an
existing fixture) that calls get_schema_text()/a minimal RPC probe to detect
whether create_node/create_edge/get_node and Cypher are actually supported, and
then either mark the tests with pytest.mark.xfail or call pytest.skip when the
probe fails; alternatively update CI to pin a server image that supports these
features and remove the gating. Ensure references to create_node, create_edge,
get_node and get_schema_text are used in the probe so the tests only run when
the backend supports them.
---
Outside diff comments:
In `@coordinode/client.py`:
- Around line 331-364: CoordinodeClient must manage its own lifecycle: make _run
ensure the async client is connected before awaiting (call self._async.connect()
via the event loop if not already connected) so first RPCs don't hit
uninitialized stubs; add a public close(self) that runs self._async.close() on
the event loop and then closes the loop (idempotent/safe if already closed);
update __exit__ to call this new close() and keep __enter__ to call connect as
it already does. Ensure checks use the AsyncCoordinodeClient instance/state
(self._async) to determine connection status and avoid re-creating loops or
raising AttributeError.
In `@langchain-coordinode/langchain_coordinode/graph.py`:
- Around line 105-167: The parser _parse_schema no longer matches the text
emitted by CoordinodeClient.get_schema_text (which now uses "Node labels:" and
"Edge types:" with inline "(properties: ...)" and different bullets), causing
structured_schema populated by refresh_schema to be empty; update _parse_schema
to recognize the new headers ("Node labels:" and "Edge types:"), parse inline
property lists in bullet lines (e.g., extract label/edge name and parse
"(properties: prop1: TYPE, prop2: TYPE)" into node_props and rel_props), and
still collect relationships from inline "A -> B" fragments, or else change
refresh_schema to build structured_schema from the schema RPC response directly;
locate and update _parse_schema (and
refresh_schema/CoordinodeClient.get_schema_text usage) to ensure node_props,
rel_props, and relationships are correctly populated from the new text format.
In `@tests/unit/test_types.py`:
- Around line 38-41: The test helper class _FakeMap exposes a .fields attribute
but production code expects proto-style .entries (see
pv.map_value.entries.items() and from_property_value usage); change _FakeMap to
expose .entries (likely a dict or mapping) instead of .fields so tests calling
from_property_value or code that accesses pv.map_value.entries will find the
correct attribute; update any constructor or references in the test to populate
_FakeMap.entries with the same dict that was previously assigned to .fields.
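The CoordinodeClient lifecycle fix described in the comments above (connect lazily on first use, idempotent close, refuse reuse after close) can be illustrated with a generic wrapper; the class and its internals are illustrative, not the SDK's:

```python
# Generic wrapper illustrating the lifecycle fix discussed above:
# connect lazily on first use, refuse reuse after close. Illustrative
# only; not the SDK's CoordinodeClient.
class LazyClient:
    def __init__(self) -> None:
        self._connected = False
        self._closed = False

    def _ensure(self) -> None:
        if self._closed:
            raise RuntimeError("client has been closed and cannot be reused")
        if not self._connected:
            self._connected = True  # stands in for the real connect()

    def call(self) -> str:
        self._ensure()
        return "ok"

    def close(self) -> None:  # idempotent; __exit__ would reuse this
        self._connected = False
        self._closed = True

c = LazyClient()
print(c.call())  # ok -- usable without a context manager
c.close()
```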
- AsyncCoordinodeClient: port default None → distinguishes omitted from
explicit 7080; fixes silent conflict when caller passes explicit port
matching 7080 but host string embeds different port
- CoordinodeClient: same port sentinel change; add lazy connect in _run()
so client is usable outside context manager; add public close() method
reused by __exit__ (fixes AttributeError in langchain/llama-index teardown)
- Host:port parsing tests: use AsyncCoordinodeClient directly (no event loop
created → no loop leak); add test for explicit-port-conflict raise
- test_sdk.py: remove stale "xfail until wired" section comment
- test_types.py: rename _FakeMap.fields → entries to match proto accessor
- langchain graph.py: rewrite _parse_schema to match get_schema_text() format
("Edge types:" header, inline "(properties: ...)" on bullet lines)
@coderabbitai Re T39: Fixed — port default changed to `None`.
Great work closing all of these out!
… import
- _types.py: replace isinstance(v, int | float) and isinstance(py_val, list | tuple) with tuple-of-types form (int, float) and (list, tuple); identical semantics on all Python versions, eliminates repeated Copilot false-positive lint hits
- client.py: remove no-op 'from coordinode._proto import __init__ as _setup_path' (_proto/__init__.py has no sys.path side effects; import served no purpose)
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@coordinode/client.py`:
- Around line 367-371: The _run method currently calls
self._loop.run_until_complete even after close() may have closed the loop,
causing a RuntimeError: Event loop is closed; update _run to first check
self._loop.is_closed() and raise a clear, descriptive exception (e.g., "Client
has been closed and cannot be reused") if the loop is closed, and also ensure
the existing connected logic (self._connected and self._async.connect()) only
runs when the loop is open; reference the _run method, the _loop attribute, the
_connected flag, and the _async.connect() call when implementing this defensive
check.
In `@langchain-coordinode/langchain_coordinode/graph.py`:
- Around line 130-132: The relationships list (relationships) is initialized but
never populated, yet GraphCypherQAChain expects relationship triples like
{"start":"A","type":"REL","end":"B"}; add a clear TODO comment adjacent to the
relationships declaration (and similarly where other empty lists node_props /
rel_props are declared) explaining that the current CoordiNode schema text
format does not include start/type/end relationship triples and therefore
relationships are left empty, include the expected triple format and a note for
future implementers to parse/populate relationships if schema format changes or
to add a mapping function to produce those triples for GraphCypherQAChain.
In `@tests/integration/test_sdk.py`:
- Around line 30-35: The custom pytest fixture named "run" creates a manual
event loop which can conflict with pytest-asyncio and async tests using
AsyncCoordinodeClient; remove or stop exporting this manual loop and instead
rely on pytest-asyncio's built-in event loop fixture (or rename the manual
fixture to something like "sync_run" and only use it in synchronous tests).
Update any tests that currently depend on "run" to use the pytest-asyncio
"event_loop" fixture (or asyncio.run for one-off sync calls) so
`@pytest.mark.asyncio` tests and AsyncCoordinodeClient use the same event loop
management.
…emove dead fixture
- client.py: raise RuntimeError in _run() when event loop is closed (prevents obscure 'Event loop is closed' error when client reused after close())
- langchain graph.py: add TODO comment on empty relationships list explaining schema text lacks direction info; document Cypher introspection path
- test_sdk.py: remove unused run fixture (manual event loop, never referenced by any test); remove orphaned asyncio import
@coderabbitai Re #44: Fixed — added a `RuntimeError` in `_run()` when the event loop is closed.
…, fix formatting
- refresh_schema() now populates relationships via MATCH (a)-[r]->(b) Cypher query since get_schema_text() only lists edge types without direction info
- Remove TODO comment from _parse_schema (logic lives in refresh_schema)
- Apply ruff format to tests/integration/test_sdk.py (blank line after fixture removal)
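The refresh_schema() approach above, deriving relationship triples from a Cypher introspection query, can be sketched with assumed row shapes:

```python
# Sketch of deriving GraphCypherQAChain-style relationship triples from
# rows returned by a Cypher introspection query such as
#   MATCH (a)-[r]->(b)
#   RETURN DISTINCT labels(a)[0] AS start, type(r) AS type, labels(b)[0] AS end
# The row contents below are illustrative.
rows = [
    {"start": "Person", "type": "KNOWS", "end": "Person"},
    {"start": "Person", "type": "WORKS_AT", "end": "Company"},
]
relationships = [
    {"start": r["start"], "type": r["type"], "end": r["end"]} for r in rows
]
print(relationships[0])
```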
Summary
- CoordinodeClient("localhost:7080") now works — constructor parses host:port string
- health() returns bool — tests updated accordingly
- cypher() returns List[Dict[str, Any]] — LangChain query(), LlamaIndex structured_query(), get(), get_triplets(), get_rel_map() all fixed to use dict row access
- VectorResult fields: .node.id, .node.properties, .distance (not .node_id, .score)
- test_types.py: _FakePV vector kind "vector" → "vector_value"

Closes #1
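A hedged sketch of the corrected result shapes, using stand-in dataclasses; the field names (.node.id, .node.properties, .distance) match the PR, but these classes are illustrative, not the SDK's own definitions:

```python
from dataclasses import dataclass, field
from typing import Any

# Stand-in types mirroring the corrected result shapes from the summary;
# illustrative only, not the SDK's own definitions.
@dataclass
class Node:
    id: str
    properties: dict[str, Any] = field(default_factory=dict)

@dataclass
class VectorResult:
    node: Node
    distance: float

rows: list[dict[str, Any]] = [{"name": "Ada"}]  # cypher() row shape
print(rows[0]["name"])  # plain dict access, no .columns / .rows
r = VectorResult(Node("n1", {"name": "Ada"}), 0.12)
print(r.node.id, r.distance)
```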
CoordinodeClient("localhost:7080")now works — constructor parseshost:portstringhealth()returnsbool— tests updated accordinglycypher()returnsList[Dict[str, Any]]— LangChainquery(), LlamaIndexstructured_query(),get(),get_triplets(),get_rel_map()all fixed to use dict row accessVectorResultfields:.node.id,.node.properties,.distance(not.node_id,.score)test_types.py:_FakePVvector kind"vector"→"vector_value"Closes #1