
Add supply point map MVP with SQLite export and local OpenLayers UI #20

Merged
ochanuco merged 4 commits into main from pr/supply-point-map-mvp
Mar 19, 2026

Conversation

@ochanuco
Owner

@ochanuco ochanuco commented Mar 19, 2026

Summary

  • add a local map MVP for mart.rideoasis_supply_points with an OpenLayers frontend
  • add SQLite export and a read-only local API for supply point lookup
  • add GPX parsing and near-route filtering plus unit tests and docs

Testing

  • npm test
  • re-exported BigQuery data from chanukott.rideoasis_mart.rideoasis_supply_points
  • verified /api/supply-points returns GeoJSON from local SQLite

Closes #19

Summary by CodeRabbit

  • New Features

    • Added a supply point map frontend: GPX upload, map display, candidate detection along the route (filtered by distance threshold, chain, and minimum level), point selection with detail popups, and a recalculate action.
    • Added a local server for the local MVP, a BigQuery→SQLite export, and npm startup scripts.
  • Documentation

    • Documented the local workflow and API usage flow in the README.
  • Tests

    • Added unit tests around GPX parsing, route math, and query/persistence helpers.
  • Chores

    • Added .local to .gitignore, specified a minimum Node.js version, and updated the GitHub Actions test matrix.

@coderabbitai

coderabbitai Bot commented Mar 19, 2026

Warning

Rate limit exceeded

@ochanuco has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 10 minutes and 59 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 475b59b4-bab7-4b01-ae30-de8b0ac0f0f2

📥 Commits

Reviewing files that changed from the base of the PR and between 071df66 and fdebaa3.

📒 Files selected for processing (4)
  • frontend/app.js
  • lib/map_data.js
  • scripts/map_dev_server.js
  • tests/map_data.test.js

Walkthrough

Adds an OpenLayers frontend, an in-browser GPX parser, route-distance utilities, a BigQuery→SQLite export script, a read-only local API server, and related unit tests, implementing a local MVP that displays supply points near a GPX route.

Changes

Cohort / File(s) Summary
Repository config & docs
.gitignore, README.md, package.json
Added .local to .gitignore; documented the frontend local MVP workflow and API contract in the README; added the npm scripts export:map-db and map:serve; set engines.node to >=22.5.0
Frontend: HTML / styles / UI
frontend/index.html, frontend/styles.css, frontend/app.js
Added a static OpenLayers-based UI (GPX upload, distance threshold, chain/point_level filters, status, list, popup). app.js drives the flow: GPX parsing → API fetch → client-side distance filtering → rendering
Frontend: utilities
frontend/gpx.js, frontend/route_math.js
Added a UMD-compatible GPX parser (coordinate extraction, GeoJSON generation) and RouteMath (bbox computation, bbox expansion, point-to-route shortest distance in meters)
Backend: data model & helpers
lib/map_data.js
Centralized utilities for CLI argument parsing and input validation, BigQuery SELECT generation, SQLite schema/UPSERT statements, API filter → SQL generation, and row → GeoJSON conversion
Backend: scripts & server
scripts/export_map_db.js, scripts/map_dev_server.js
Added a CLI that exports from BigQuery to a local SQLite file (dry-run, transactions, normalization) and a local dev server that serves static files and GET /api/supply-points
Tests
tests/map_data.test.js, tests/route_math.test.js
Added unit tests covering argument parsing, bbox/chain/constraint parsing, BigQuery SQL generation, data normalization and upsert, GPX parsing, bbox/distance math, and GeoJSON output
CI
.github/workflows/unit-tests.yml
Updated the GitHub Actions test matrix to Node.js 22/24

Sequence Diagram(s)

sequenceDiagram
    participant User as User
    participant Frontend as Frontend (app.js)
    participant GpxParser as GPX Parser (gpx.js)
    participant RouteMath as RouteMath (route_math.js)
    participant ApiServer as API Server (map_dev_server.js)
    participant Sqlite as SQLite DB

    User->>Frontend: Upload GPX file
    Frontend->>GpxParser: parseGpxText(gpx)
    GpxParser-->>Frontend: LineString GeoJSON
    Frontend->>RouteMath: computeBbox() / expandBbox()
    RouteMath-->>Frontend: expanded bbox
    Frontend->>ApiServer: GET /api/supply-points?bbox=...&chains=...&min_point_level=...
    ApiServer->>Sqlite: run parameterized SELECT
    Sqlite-->>ApiServer: candidate rows
    ApiServer-->>Frontend: FeatureCollection (GeoJSON)
    Frontend->>RouteMath: pointToRouteDistanceMeters() (per candidate)
    RouteMath-->>Frontend: distance (m)
    Frontend-->>User: Show supply points filtered by threshold on map and in list

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

Poem

🐇 A word from the rabbit
Hopping along with a GPX in paw 🗺️
BigQuery goes traveling and lands in SQLite
Supply points hop and sparkle on the map ✨
A popup says hello, so off we go!

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
Check name / Status / Explanation
  • Description Check: ✅ Passed. Check skipped because CodeRabbit’s high-level summary is enabled.
  • Title check: ✅ Passed. The title clearly and concisely describes the main change: adding a supply point map MVP with SQLite export and OpenLayers UI, which matches the core objectives.
  • Linked Issues check: ✅ Passed. The PR meets all coding requirements from issue #19: SQLite export script with proper schema, read-only local API with required endpoints and filtering, GPX parsing, point-to-route distance calculation, and OpenLayers frontend with all specified UI features and interactions.
  • Out of Scope Changes check: ✅ Passed. All changes are within scope: .gitignore/.github additions for local files and CI configuration, documentation updates, and implementation of all MVP components specified in issue #19. No unrelated modifications detected.
  • Docstring Coverage: ✅ Passed. Docstring coverage is 100.00%, which is sufficient. The required threshold is 80.00%.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 7

🧹 Nitpick comments (4)
scripts/map_dev_server.js (1)

33-43: Internal errors should not also become 400.

Here, input-validation failures and SQLite execution failures both map to 400. A corrupt DB file or a schema mismatch then looks like the client's fault, so separating 4xx from 5xx would make triage much easier.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/map_dev_server.js` around lines 33 - 43, The catch currently maps all
errors to 400, which hides server-side failures; update the error handling
around parseSupplyPointFilters/buildSupplyPointsQuery/database.prepare(...).all
so validation errors return 400 while database/runtime errors return 500:
introduce or use a specific error type/name (e.g., ValidationError) thrown by
parseSupplyPointFilters (or detect known validation conditions), then in the
catch check error instanceof ValidationError or error.name === 'ValidationError'
and respond with 400, otherwise respond with 500 and include the error message;
ensure the response content-type and HEAD handling remain the same.
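The 4xx/5xx split this prompt describes can be sketched as follows. ValidationError and statusFor are assumed names for illustration, not code from this PR:

```javascript
// Sketch of the suggested 400/500 split; ValidationError and statusFor
// are hypothetical names, not the PR's actual code.
class ValidationError extends Error {
  constructor(message) {
    super(message);
    this.name = 'ValidationError';
  }
}

// The real handler's catch block would call something like this before
// writing the response head: client mistakes get 400, everything else 500.
function statusFor(error) {
  return error instanceof ValidationError ? 400 : 500;
}
```

parseSupplyPointFilters would then throw ValidationError for bad query parameters, while SQLite failures keep their generic Error type and surface as 500.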
package.json (1)

16-17: The supported Node version should be stated explicitly.

At minimum, map:serve runs scripts/map_dev_server.js, which depends on node:sqlite, so it fails immediately after startup on an unsupported Node. Pinning the expected Node version in package.json engines, or somewhere in the README/CI, would noticeably improve reproducibility.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@package.json` around lines 16 - 17, No explicit Node version is declared, so
`map:serve` (scripts/map_dev_server.js depends on node:sqlite) fails right at
startup on an unsupported Node; add an `engines` field to package.json stating
the minimum/recommended supported Node version (e.g., "node": ">=18"), or add
the same prerequisite to the README and CI configuration to ensure
reproducibility. Update package.json (the "map:serve" entry under "scripts")
and the script file scripts/map_dev_server.js (node:sqlite dependency)
accordingly.
frontend/app.js (1)

132-138: escapeHtml does not escape single quotes.

The current call sites (attributes wrapped in double quotes) are safe, but as a more defensive implementation, consider also escaping '.

🛡️ Add single-quote escaping
 function escapeHtml(value) {
   return String(value)
     .replaceAll('&', '&amp;')
     .replaceAll('<', '&lt;')
     .replaceAll('>', '&gt;')
-    .replaceAll('"', '&quot;');
+    .replaceAll('"', '&quot;')
+    .replaceAll("'", '&#39;');
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app.js` around lines 132 - 138, The escapeHtml function is missing
escaping for single quotes which can lead to attribute injection in some
contexts; update the escapeHtml implementation to also replace single-quote
characters (') with an HTML entity (e.g., &#39; ) so replaceAll("'", "&#39;") is
applied alongside the existing replacements in the escapeHtml function to ensure
single quotes are safely encoded.
scripts/export_map_db.js (1)

35-42: execFileSync has no timeout configured.

If the BigQuery query runs long, the script can hang indefinitely. That is acceptable for the MVP, but consider adding the timeout option later.

📝 Suggested timeout addition
 function fetchMartRows(options) {
   const args = buildBqArgs(options);
   const output = execFileSync('bq', args, {
     encoding: 'utf8',
-    maxBuffer: 256 * 1024 * 1024
+    maxBuffer: 256 * 1024 * 1024,
+    timeout: 10 * 60 * 1000 // 10 minutes
   });
   return JSON.parse(output);
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/export_map_db.js` around lines 35 - 42, The fetchMartRows function
uses execFileSync without a timeout, which can hang on long BigQuery queries;
update fetchMartRows to accept or compute a timeout (e.g., a
DEFAULT_BQ_TIMEOUT_MS constant or an options.timeout) and pass it into the
execFileSync call's options object alongside encoding and maxBuffer so the child
process is killed after the timeout; keep the existing maxBuffer and preserve
JSON.parse(output) behavior in fetchMartRows and ensure buildBqArgs usage
remains unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@frontend/app.js`:
- Around line 140-153: buildPopupHtml currently uses props.source_url directly
in an href which allows dangerous schemes (e.g., javascript: or data:) and can
lead to XSS; update buildPopupHtml to validate the URL scheme before creating
the anchor: parse props.source_url, allow only safe schemes such as http and
https (or an explicit allowlist), and only set the href when the scheme is
allowed; for disallowed or invalid URLs, render a safe fallback (e.g., '-' or
the escaped URL as plain text) and continue to use escapeHtml for all inserted
values to ensure output is escaped.
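A minimal version of the scheme allowlist this prompt describes might look like the following sketch. safeHref is a hypothetical helper name, not the PR's code:

```javascript
// Sketch (safeHref is an assumed name): allow only http/https links
// through to the popup's href; anything else falls back to plain text.
function safeHref(rawUrl) {
  try {
    const parsed = new URL(String(rawUrl));
    if (parsed.protocol === 'http:' || parsed.protocol === 'https:') {
      return parsed.href;
    }
  } catch {
    // Invalid URL: fall through to the null fallback below.
  }
  return null; // caller renders '-' or the escaped URL as plain text instead
}
```

buildPopupHtml would then only emit an anchor when safeHref returns a non-null value, still passing everything through escapeHtml.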

In `@frontend/gpx.js`:
- Around line 8-20: parseCoordinateTokens currently uses POINT_RE which only
matches when lat="..." appears before lon="..." and only double quotes; this
misses valid GPX points. Replace the regex approach inside
parseCoordinateTokens: parse gpxText with a proper XML parser (e.g., DOMParser)
or a robust pattern that finds each <trkpt>/<rtept> element and reads its lat
and lon attributes (accepting single or double quotes and any attribute order),
then push [lng, lat] into coords only when both attributes are present and
numeric; update references to POINT_RE and gpxText accordingly and remove
reliance on attribute ordering.
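The "robust pattern" alternative mentioned above could be sketched like this; attribute order and quote style no longer matter. In a browser, DOMParser would be the sturdier choice, and this body is illustrative only even though it reuses the parseCoordinateTokens name:

```javascript
// Sketch of a quote- and order-tolerant point parser; mirrors the name
// in frontend/gpx.js but is not the PR's actual implementation.
function parseCoordinateTokens(gpxText) {
  const coords = [];
  // Match each <trkpt>/<rtept> start tag and capture its attribute string.
  const pointRe = /<(?:trkpt|rtept)\b([^>]*)>/g;
  // Match lat/lon attributes in any order, single- or double-quoted.
  const attrRe = /\b(lat|lon)\s*=\s*(?:"([^"]*)"|'([^']*)')/g;
  for (const [, attrs] of String(gpxText).matchAll(pointRe)) {
    let lat;
    let lon;
    for (const m of attrs.matchAll(attrRe)) {
      const value = Number(m[2] ?? m[3]);
      if (m[1] === 'lat') lat = value;
      else lon = value;
    }
    // Keep the point only when both attributes are present and numeric.
    if (Number.isFinite(lat) && Number.isFinite(lon)) {
      coords.push([lon, lat]);
    }
  }
  return coords;
}
```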

In `@lib/map_data.js`:
- Around line 127-147: The buildBqSelectSql function currently interpolates the
raw project parameter into the FROM clause without validation; add a BigQuery
project ID validator (e.g., validateProjectId) that enforces allowed characters
and format (letters, digits, hyphens, proper length/position rules) and call it
at the start of buildBqSelectSql to produce a safe validatedProject variable;
then use validatedProject instead of project in the FROM template and throw or
log a clear error if validation fails so untrusted input cannot inject invalid
project identifiers.
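A validator along those lines might look like this sketch. validateProjectId is an assumed helper name; the character rules follow GCP's documented project ID format:

```javascript
// Sketch: GCP project IDs are 6-30 characters of lowercase letters,
// digits and hyphens, start with a letter, and must not end with a
// hyphen. validateProjectId is a hypothetical name.
function validateProjectId(project) {
  if (!/^[a-z][a-z0-9-]{4,28}[a-z0-9]$/.test(String(project))) {
    throw new Error(`invalid BigQuery project id: ${project}`);
  }
  return project;
}
```

buildBqSelectSql would call this before interpolating the project into the FROM clause, so untrusted input can never reach the SQL template.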

In `@README.md`:
- Around line 156-158: The README's API parameter names don't match the
implementation: update the GET /api/supply-points documentation to list the
actual parameter names used by the server—replace `chain` with `chains` and
replace `min_point_level` with `limit` (keep `bbox` if still supported) so the
docs read something like bbox / chains / limit to match the implementation.

In `@scripts/export_map_db.js`:
- Around line 1-13: The code imports DatabaseSync from 'node:sqlite' in
export_map_db.js (const { DatabaseSync } = require('node:sqlite')) and
tests/map_data.test.js also rely on this; because 'node:sqlite' requires Node.js
>=22.5.0 you must make the project and CI consistent: either (A) update the
GitHub Actions matrix to only test Node >=22.5.0, or (B) set package.json
"engines": {"node":" >=22.5.0"} to declare the requirement, or (C) replace the
'node:sqlite' usage (DatabaseSync and related DB operations in export_map_db.js
and any helpers in map_data) with a Node-20-compatible sqlite library; pick one
approach and apply it consistently to both export_map_db.js (remove/replace
DatabaseSync import if choosing C) and tests/map_data.test.js so CI and runtime
use a compatible Node/sqilte combination.

In `@scripts/map_dev_server.js`:
- Around line 79-80: The code currently constructs a URL using the
client-supplied request.headers.host which can be malicious and cause new
URL(...) to throw; update the server request handler so it uses a fixed base
(e.g. 'http://localhost') instead of request.headers.host when creating
requestUrl, wrap the URL parsing in a try/catch (the callback that creates
requestUrl) and respond with a 400 Bad Request if parsing fails, and ensure any
references to requestUrl (used later in the handler) continue to work with the
safely parsed URL.
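The fixed-base parsing this prompt suggests can be sketched as follows (parseRequestUrl is a hypothetical helper name):

```javascript
// Sketch: resolve the request path against a fixed base so a hostile
// Host header can never make `new URL(...)` throw inside the handler.
function parseRequestUrl(requestPath) {
  try {
    return new URL(requestPath, 'http://localhost');
  } catch {
    return null; // caller responds 400 Bad Request
  }
}
```

The handler then keeps using pathname and searchParams exactly as before, but no longer trusts request.headers.host.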

In `@tests/map_data.test.js`:
- Line 3: The test imports node:sqlite (const { DatabaseSync } =
require('node:sqlite')) which is only available in Node >=22.5 and causes
ERR_UNKNOWN_BUILTIN_MODULE on older CI nodes; either update the CI Node.js
matrix to remove Node 20 and ensure all runners use >=22.5, or replace this
built-in usage by switching to a compatible npm driver (e.g., install and
require better-sqlite3 and update tests that use DatabaseSync to use the
better-sqlite3 API), making sure to update any setup/teardown in tests that
reference DatabaseSync.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: a42ba3b3-cb96-4918-b026-bd5a5d820fb5

📥 Commits

Reviewing files that changed from the base of the PR and between d79e4ca and 98397bc.

📒 Files selected for processing (13)
  • .gitignore
  • README.md
  • frontend/app.js
  • frontend/gpx.js
  • frontend/index.html
  • frontend/route_math.js
  • frontend/styles.css
  • lib/map_data.js
  • package.json
  • scripts/export_map_db.js
  • scripts/map_dev_server.js
  • tests/map_data.test.js
  • tests/route_math.test.js

Comment thread frontend/app.js
Comment thread frontend/gpx.js Outdated
Comment thread lib/map_data.js
Comment thread README.md
Comment thread scripts/export_map_db.js
Comment thread scripts/map_dev_server.js Outdated
Comment thread tests/map_data.test.js
@ochanuco
Owner Author

@coderabbitai review

@coderabbitai

coderabbitai Bot commented Mar 19, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 4

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@frontend/app.js`:
- Around line 269-272: The frontend currently hardcodes
params.set('limit','10000') and fetches `${API_BASE}/supply-points?...` which,
because lib/map_data.js orders results with "ORDER BY chain, name,
supply_point_id LIMIT :limit", causes nearby points beyond the first 10,000 to
be dropped; change the client to detect when the server returned exactly the
limit and then page or split the bbox and re-request (e.g., add offset/limit
query params or iteratively subdivide bbox and merge results) until a response
returns fewer than the limit (or use a server-provided overflow
flag/total_count), and update the code around params/response handling in
frontend/app.js to loop/fetch additional pages or sub-bboxes rather than relying
on a single 10,000-row request.
- Around line 259-268: When selectedChains() returns an empty array, nothing is
set on params, so the backend interprets it as "all chains". Check the result of
selectedChains() and, when chains.length === 0, explicitly add an empty chains
parameter such as params.set('chains', '') so the frontend sends "no chains
(zero results)" (relevant symbols: selectedChains(), params / URLSearchParams
handling).

In `@lib/map_data.js`:
- Around line 270-289: The parser allows arbitrarily large limit values,
enabling heavy requests; update parseSupplyPointFilters to enforce a sane
maximum for limit by checking the parsed value from parsePositiveInt (or add an
upper-bound check immediately after) and reject or clamp values above a defined
MAX_LIMIT constant (e.g., compare against DEFAULT_LIMIT and a new MAX_LIMIT);
reference parsePositiveInt, parseSupplyPointFilters, DEFAULT_LIMIT and
introduce/use MAX_LIMIT so requests like ?limit=1000000000 are refused or capped
before the data is fetched and serialized.

In `@scripts/map_dev_server.js`:
- Around line 72-75: The current static file serving uses
fs.createReadStream(filePath).pipe(response) without handling stream errors;
wrap the ReadStream usage returned by fs.createReadStream(filePath) and attach
'open' and 'error' handlers (on the stream object returned) to handle races
where the file vanishes after existsSync/statSync; on 'open' pipe to response
and on 'error' write an appropriate status (404 for ENOENT, 500 otherwise) and
end the response, and ensure you do not call response.writeHead twice (use the
MIME_TYPES lookup for content-type only after the 'open' or set a safe default
header before piping). Reference symbols: createReadStream, filePath,
MIME_TYPES, response.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 61788140-8c84-4fc6-b75e-02b79c67748a

📥 Commits

Reviewing files that changed from the base of the PR and between 98397bc and 8cfbfcb.

📒 Files selected for processing (10)
  • .github/workflows/unit-tests.yml
  • README.md
  • frontend/app.js
  • frontend/gpx.js
  • lib/map_data.js
  • package.json
  • scripts/export_map_db.js
  • scripts/map_dev_server.js
  • tests/map_data.test.js
  • tests/route_math.test.js
✅ Files skipped from review due to trivial changes (2)
  • .github/workflows/unit-tests.yml
  • README.md
🚧 Files skipped from review as they are similar to previous changes (4)
  • package.json
  • tests/route_math.test.js
  • frontend/gpx.js
  • scripts/export_map_db.js

Comment thread frontend/app.js
Comment thread frontend/app.js Outdated
Comment thread lib/map_data.js
Comment thread scripts/map_dev_server.js Outdated
@ochanuco
Owner Author

@coderabbitai review

@coderabbitai

coderabbitai Bot commented Mar 19, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.


@coderabbitai coderabbitai Bot left a comment


♻️ Duplicate comments (4)
scripts/map_dev_server.js (1)

70-79: ⚠️ Potential issue | 🟠 Major

Static-file TOCTOU / stream errors are not handled.

If the file disappears right after existsSync() / statSync(), statSync() throws, and on the GET side an 'error' event from createReadStream() likewise crashes straight out of the request handler. This can happen simply while editing frontend assets, so wrap statSync in try/catch and handle the stream's open/error events, falling back to 404/500.

Also applies to: 101-110

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@scripts/map_dev_server.js` around lines 70 - 79, Wrap the sync stat check in
a try/catch around fs.statSync(filePath) (after the existsSync check) and treat
exceptions as a 404 fallback; when creating the read stream with
fs.createReadStream(filePath) attach 'open' and 'error' handlers so that on
'error' you write a 500 (or 404 if ENOENT) via response.writeHead/response.end
and on 'open' you pipe the stream to response; ensure MIME_TYPES/path.extname
logic remains and avoid throwing from the request handler by handling all
fs/stat/stream errors (references: filePath, fs.existsSync, fs.statSync,
fs.createReadStream, MIME_TYPES, path.extname, response.writeHead,
response.end).
frontend/app.js (2)

273-283: ⚠️ Potential issue | 🟠 Major

Turning all chains off results in a full search rather than zero results.

When selectedChains() is empty, chains is not sent, so the chain condition disappears on the API side and everything is fetched. The UI's "all off" state should be treated as zero results here.

Suggested fix
 async function fetchCandidatePoints(distanceMeters) {
   const chains = selectedChains();
+  if (chains.length === 0) {
+    return { type: 'FeatureCollection', features: [] };
+  }
   const minPointLevel = Number(elements.minPointLevel.value) || 8;
   const bbox = expandedBboxForQuery(distanceMeters);
   const params = new URLSearchParams();
   if (bbox) {
     params.set('bbox', bbox.join(','));
   }
-  if (chains.length > 0) {
-    params.set('chains', chains.join(','));
-  }
+  params.set('chains', chains.join(','));
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app.js` around lines 273 - 283, selectedChains() being empty
currently omits the 'chains' query param so the API treats it as "no filter" and
returns all results; instead we must send an explicit value that indicates "no
chains selected". In fetchCandidatePoints, always set params.set('chains', ...)
— use chains.join(',') when chains.length>0 and set params.set('chains','') (or
a documented sentinel like 'NONE' if the backend expects it) when
chains.length===0 so the server can interpret "全部オフ" as zero results; update the
logic around selectedChains and params in fetchCandidatePoints accordingly.

284-291: ⚠️ Potential issue | 🟠 Major

A single fetch with a fixed limit=10000 drops nearby points.

The server side (lib/map_data.js) uses ORDER BY chain, name, supply_point_id LIMIT :limit, so when the bbox contains more than 10,000 rows, only the first 10,000 reach the client-side distance check. Unless features.length === limit is treated as overflow and handled with paging or bbox splitting, long GPX routes / wide thresholds will lose points.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/app.js` around lines 284 - 291, The client uses a fixed
params.set('limit', "10000") and does a single fetch
(`fetch(`${API_BASE}/supply-points?${params.toString()}`)`), which drops points
when the bbox contains >limit results; change the fetch logic in frontend/app.js
around that call to detect overflow (when returned features.length === limit)
and handle it by either implementing paging or recursively subdividing the bbox
and issuing additional fetches until each sub-bbox returns <limit; replace the
hardcoded '10000' with a LIMIT constant and add re-fetch logic that merges
results before distance-based filtering so no nearby points are lost (operate on
function/scope that calls the fetch and merges results).
lib/map_data.js (1)

282-303: ⚠️ Potential issue | 🟠 Major

The API limit has no upper bound.

A value like ?limit=1000000000 passes straight through, so database.prepare(...).all() and JSON.stringify() can easily become heavy even locally. Reject or clamp to a sane max in parseSupplyPointFilters().

Suggested fix
 const DEFAULT_MIN_POINT_LEVEL = 8;
 const DEFAULT_LIMIT = 5000;
+const MAX_LIMIT = 10000;
 ...
-function parsePositiveInt(value, label, fallback) {
+function parsePositiveInt(value, label, fallback, max = Number.MAX_SAFE_INTEGER) {
   if (value == null || value === '') return fallback;
   const parsed = Number(value);
-  if (!Number.isInteger(parsed) || parsed <= 0) {
-    throw new ValidationError(`${label} must be a positive integer`);
+  if (!Number.isInteger(parsed) || parsed <= 0 || parsed > max) {
+    throw new ValidationError(`${label} must be a positive integer <= ${max}`);
   }
   return parsed;
 }
 ...
-    limit: parsePositiveInt(searchParams.get('limit'), 'limit', DEFAULT_LIMIT)
+    limit: parsePositiveInt(searchParams.get('limit'), 'limit', DEFAULT_LIMIT, MAX_LIMIT)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@lib/map_data.js` around lines 282 - 303, The limit query currently has no
upper bound and can cause heavy DB/JSON work; add a sane maximum (e.g.,
MAX_LIMIT) and enforce it when parsing the limit in parseSupplyPointFilters:
either clamp the parsed value to Math.min(parsedLimit, MAX_LIMIT) or validate
and throw a ValidationError if parsedLimit > MAX_LIMIT. You can implement this
by adding a MAX_LIMIT constant and applying the check/Clamp specifically for the
'limit' field in parseSupplyPointFilters (or extend parsePositiveInt to accept
an optional max parameter) so DEFAULT_LIMIT and other ints remain unchanged.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 88930e06-b999-4e90-9aa6-c203a25e1fdf

📥 Commits

Reviewing files that changed from the base of the PR and between 8cfbfcb and 071df66.

📒 Files selected for processing (6)
  • frontend/app.js
  • frontend/gpx.js
  • frontend/route_math.js
  • lib/map_data.js
  • scripts/export_map_db.js
  • scripts/map_dev_server.js
✅ Files skipped from review due to trivial changes (1)
  • frontend/gpx.js
🚧 Files skipped from review as they are similar to previous changes (2)
  • frontend/route_math.js
  • scripts/export_map_db.js

@ochanuco ochanuco merged commit 8d4e2d9 into main Mar 19, 2026
3 checks passed
@ochanuco ochanuco deleted the pr/supply-point-map-mvp branch March 19, 2026 15:49

Development

Successfully merging this pull request may close these issues.

Add supply point map MVP using local SQLite export and OpenLayers

1 participant