Add supply point map MVP with SQLite export and local OpenLayers UI#20
Conversation
Walkthrough
Adds an OpenLayers frontend, an in-browser GPX parser, route-distance utilities, a BigQuery→SQLite export script, a local read-only API server, and related unit tests, implementing a local MVP that displays supply points near a GPX route.

Changes
Sequence Diagram(s)sequenceDiagram
participant User as User
participant Frontend as Frontend (app.js)
participant GpxParser as GPX Parser (gpx.js)
participant RouteMath as RouteMath (route_math.js)
participant ApiServer as API Server (map_dev_server.js)
participant Sqlite as SQLite DB
User->>Frontend: Upload GPX file
Frontend->>GpxParser: parseGpxText(gpx)
GpxParser-->>Frontend: LineString GeoJSON
Frontend->>RouteMath: computeBbox() / expandBbox()
RouteMath-->>Frontend: Expanded bbox
Frontend->>ApiServer: GET /api/supply-points?bbox=...&chains=...&min_point_level=...
ApiServer->>Sqlite: Run parameterized SELECT
Sqlite-->>ApiServer: Candidate rows
ApiServer-->>Frontend: FeatureCollection (GeoJSON)
Frontend->>RouteMath: pointToRouteDistanceMeters()(各候補)
RouteMath-->>Frontend: Distance (m)
Frontend-->>User: Show supply points filtered by the threshold on the map and in the list
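For reference, the pointToRouteDistanceMeters step in the diagram can be sketched roughly as follows. This is an illustrative sketch only: the real frontend/route_math.js implementation is not shown in this thread, and the vertex-only minimum here is a simplification of a proper point-to-segment projection.

```javascript
// Sketch: distance from a point to a polyline, in meters.
// Uses the haversine formula per vertex; a production version would also
// project the point onto each segment rather than only checking vertices.
const EARTH_RADIUS_M = 6371000;

function haversineMeters([lng1, lat1], [lng2, lat2]) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLng = toRad(lng2 - lng1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLng / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

function pointToRouteDistanceMeters(point, routeCoords) {
  // Minimum distance from `point` to any vertex of the route.
  return routeCoords.reduce(
    (min, vertex) => Math.min(min, haversineMeters(point, vertex)),
    Infinity
  );
}
```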
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ Passed checks (5 passed)
Actionable comments posted: 7
🧹 Nitpick comments (4)
scripts/map_dev_server.js (1)
33-43: Avoid mapping internal errors to 400. Here, input-validation failures and SQLite execution failures both become 400. DB file corruption or a schema mismatch ends up looking like the client's fault, so separating 4xx from 5xx makes triage easier.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/map_dev_server.js` around lines 33 - 43, The catch currently maps all errors to 400, which hides server-side failures; update the error handling around parseSupplyPointFilters/buildSupplyPointsQuery/database.prepare(...).all so validation errors return 400 while database/runtime errors return 500: introduce or use a specific error type/name (e.g., ValidationError) thrown by parseSupplyPointFilters (or detect known validation conditions), then in the catch check error instanceof ValidationError or error.name === 'ValidationError' and respond with 400, otherwise respond with 500 and include the error message; ensure the response content-type and HEAD handling remain the same.

package.json (1)
16-17: The supported Node version should be made explicit. At minimum, map:serve depends on node:sqlite via scripts/map_dev_server.js, so it fails immediately on an unsupported Node. Pinning the expected Node version in package.json's engines field, or somewhere in the README/CI, would greatly improve reproducibility.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@package.json` around lines 16 - 17, no explicit Node version is declared, so `map:serve` (scripts/map_dev_server.js depends on node:sqlite) fails immediately on an unsupported Node; add an `engines` field to package.json declaring the minimum/recommended Node version (e.g., "node": ">=18"), or document the same requirement in the README and CI config to ensure reproducibility. Relevant locations: package.json (the "map:serve" entry under "scripts") and the script file scripts/map_dev_server.js (node:sqlite dependency).

frontend/app.js (1)
132-138: escapeHtml does not escape single quotes. The current call sites (double-quoted attributes) are safe, but consider adding ' escaping as a more defensive implementation.
🛡️ Add single-quote escaping

```diff
 function escapeHtml(value) {
   return String(value)
     .replaceAll('&', '&amp;')
     .replaceAll('<', '&lt;')
     .replaceAll('>', '&gt;')
-    .replaceAll('"', '&quot;');
+    .replaceAll('"', '&quot;')
+    .replaceAll("'", '&#39;');
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app.js` around lines 132 - 138, The escapeHtml function is missing escaping for single quotes which can lead to attribute injection in some contexts; update the escapeHtml implementation to also replace single-quote characters (') with an HTML entity (e.g., &#39;) so replaceAll("'", "&#39;") is applied alongside the existing replacements in the escapeHtml function to ensure single quotes are safely encoded.

scripts/export_map_db.js (1)
35-42: execFileSync has no timeout configured. If the BigQuery query runs long, the script can hang indefinitely. Acceptable for the MVP, but consider adding the timeout option in the future.
📝 Suggested timeout addition

```diff
 function fetchMartRows(options) {
   const args = buildBqArgs(options);
   const output = execFileSync('bq', args, {
     encoding: 'utf8',
-    maxBuffer: 256 * 1024 * 1024
+    maxBuffer: 256 * 1024 * 1024,
+    timeout: 10 * 60 * 1000 // 10 minutes
   });
   return JSON.parse(output);
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/export_map_db.js` around lines 35 - 42, The fetchMartRows function uses execFileSync without a timeout, which can hang on long BigQuery queries; update fetchMartRows to accept or compute a timeout (e.g., a DEFAULT_BQ_TIMEOUT_MS constant or an options.timeout) and pass it into the execFileSync call's options object alongside encoding and maxBuffer so the child process is killed after the timeout; keep the existing maxBuffer and preserve JSON.parse(output) behavior in fetchMartRows and ensure buildBqArgs usage remains unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@frontend/app.js`:
- Around line 140-153: buildPopupHtml currently uses props.source_url directly
in an href which allows dangerous schemes (e.g., javascript: or data:) and can
lead to XSS; update buildPopupHtml to validate the URL scheme before creating
the anchor: parse props.source_url, allow only safe schemes such as http and
https (or an explicit allowlist), and only set the href when the scheme is
allowed; for disallowed or invalid URLs, render a safe fallback (e.g., '-' or
the escaped URL as plain text) and continue to use escapeHtml for all inserted
values to ensure output is escaped.
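The scheme allowlist described in this prompt could look like the following sketch. safeHref and ALLOWED_SCHEMES are illustrative names, not the actual app.js code.

```javascript
// Sketch: only allow http(s) URLs into an href; anything else falls back
// to a null result so the caller can render plain escaped text instead.
const ALLOWED_SCHEMES = new Set(['http:', 'https:']);

function safeHref(rawUrl) {
  try {
    const url = new URL(String(rawUrl));
    return ALLOWED_SCHEMES.has(url.protocol) ? url.href : null;
  } catch {
    return null; // not a parseable absolute URL
  }
}
```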
In `@frontend/gpx.js`:
- Around line 8-20: parseCoordinateTokens currently uses POINT_RE which only
matches when lat="..." appears before lon="..." and only double quotes; this
misses valid GPX points. Replace the regex approach inside
parseCoordinateTokens: parse gpxText with a proper XML parser (e.g., DOMParser)
or a robust pattern that finds each <trkpt>/<rtept> element and reads its lat
and lon attributes (accepting single or double quotes and any attribute order),
then push [lng, lat] into coords only when both attributes are present and
numeric; update references to POINT_RE and gpxText accordingly and remove
reliance on attribute ordering.
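An attribute-order-independent parse, as suggested above, might be sketched like this. It is illustrative only; in the browser, DOMParser would be the more robust choice.

```javascript
// Sketch: extract [lng, lat] pairs from <trkpt>/<rtept> elements without
// depending on attribute order or quote style.
const ATTR_RE = /\b(lat|lon)\s*=\s*["']([^"']+)["']/g;

function parseCoordinateTokens(gpxText) {
  const coords = [];
  const pointRe = /<(?:trkpt|rtept)\b([^>]*)>/g;
  for (const match of gpxText.matchAll(pointRe)) {
    const attrs = {};
    for (const [, name, value] of match[1].matchAll(ATTR_RE)) {
      attrs[name] = Number(value);
    }
    // Only keep points where both attributes are present and numeric.
    if (Number.isFinite(attrs.lat) && Number.isFinite(attrs.lon)) {
      coords.push([attrs.lon, attrs.lat]);
    }
  }
  return coords;
}
```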
In `@lib/map_data.js`:
- Around line 127-147: The buildBqSelectSql function currently interpolates the
raw project parameter into the FROM clause without validation; add a BigQuery
project ID validator (e.g., validateProjectId) that enforces allowed characters
and format (letters, digits, hyphens, proper length/position rules) and call it
at the start of buildBqSelectSql to produce a safe validatedProject variable;
then use validatedProject instead of project in the FROM template and throw or
log a clear error if validation fails so untrusted input cannot inject invalid
project identifiers.
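A validator along those lines might look like this sketch. The regex encodes the commonly documented GCP project-ID rules (6-30 characters; lowercase letters, digits, and hyphens; starting with a letter and not ending with a hyphen); treat it as an assumption, not the project's actual code.

```javascript
// Sketch: validate a BigQuery/GCP project ID before interpolating it
// into a FROM clause.
const PROJECT_ID_RE = /^[a-z][a-z0-9-]{4,28}[a-z0-9]$/;

function validateProjectId(project) {
  if (typeof project !== 'string' || !PROJECT_ID_RE.test(project)) {
    throw new Error(`invalid BigQuery project id: ${JSON.stringify(project)}`);
  }
  return project;
}
```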
In `@README.md`:
- Around line 156-158: The README's API parameter names don't match the
implementation: update the GET /api/supply-points documentation to list the
actual parameter names used by the server—replace `chain` with `chains` and
replace `min_point_level` with `limit` (keep `bbox` if still supported) so the
docs read something like bbox / chains / limit to match the implementation.
In `@scripts/export_map_db.js`:
- Around line 1-13: The code imports DatabaseSync from 'node:sqlite' in
export_map_db.js (const { DatabaseSync } = require('node:sqlite')) and
tests/map_data.test.js also rely on this; because 'node:sqlite' requires Node.js
>=22.5.0 you must make the project and CI consistent: either (A) update the
GitHub Actions matrix to only test Node >=22.5.0, or (B) set package.json
"engines": {"node":" >=22.5.0"} to declare the requirement, or (C) replace the
'node:sqlite' usage (DatabaseSync and related DB operations in export_map_db.js
and any helpers in map_data) with a Node-20-compatible sqlite library; pick one
approach and apply it consistently to both export_map_db.js (remove/replace
DatabaseSync import if choosing C) and tests/map_data.test.js so CI and runtime
use a compatible Node/sqlite combination.
In `@scripts/map_dev_server.js`:
- Around line 79-80: The code currently constructs a URL using the
client-supplied request.headers.host which can be malicious and cause new
URL(...) to throw; update the server request handler so it uses a fixed base
(e.g. 'http://localhost') instead of request.headers.host when creating
requestUrl, wrap the URL parsing in a try/catch (the callback that creates
requestUrl) and respond with a 400 Bad Request if parsing fails, and ensure any
references to requestUrl (used later in the handler) continue to work with the
safely parsed URL.
In `@tests/map_data.test.js`:
- Line 3: The test imports node:sqlite (const { DatabaseSync } =
require('node:sqlite')) which is only available in Node >=22.5 and causes
ERR_UNKNOWN_BUILTIN_MODULE on older CI nodes; either update the CI Node.js
matrix to remove Node 20 and ensure all runners use >=22.5, or replace this
built-in usage by switching to a compatible npm driver (e.g., install and
require better-sqlite3 and update tests that use DatabaseSync to use the
better-sqlite3 API), making sure to update any setup/teardown in tests that
reference DatabaseSync.
---
Nitpick comments:
In `@frontend/app.js`:
- Around line 132-138: The escapeHtml function is missing escaping for single
quotes which can lead to attribute injection in some contexts; update the
escapeHtml implementation to also replace single-quote characters (') with an
HTML entity (e.g., &#39;) so replaceAll("'", "&#39;") is applied alongside the
existing replacements in the escapeHtml function to ensure single quotes are
safely encoded.
In `@package.json`:
- Around line 16-17: No explicit Node version is declared, so `map:serve`
  (scripts/map_dev_server.js depends on node:sqlite) fails immediately on an
  unsupported Node; add an `engines` field to package.json declaring the
  minimum/recommended Node version (e.g., "node": ">=18"), or document the same
  requirement in the README and CI config to ensure reproducibility. Relevant
  locations: package.json (the "map:serve" entry under "scripts") and the
  script file scripts/map_dev_server.js (node:sqlite dependency).
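Declaring the requirement in package.json, as this prompt suggests, could look like the fragment below. The version shown is an assumption based on the node:sqlite comments elsewhere in this review (node:sqlite requires Node >= 22.5.0).

```json
{
  "engines": {
    "node": ">=22.5.0"
  }
}
```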
In `@scripts/export_map_db.js`:
- Around line 35-42: The fetchMartRows function uses execFileSync without a
timeout, which can hang on long BigQuery queries; update fetchMartRows to accept
or compute a timeout (e.g., a DEFAULT_BQ_TIMEOUT_MS constant or an
options.timeout) and pass it into the execFileSync call's options object
alongside encoding and maxBuffer so the child process is killed after the
timeout; keep the existing maxBuffer and preserve JSON.parse(output) behavior in
fetchMartRows and ensure buildBqArgs usage remains unchanged.
In `@scripts/map_dev_server.js`:
- Around line 33-43: The catch currently maps all errors to 400, which hides
server-side failures; update the error handling around
parseSupplyPointFilters/buildSupplyPointsQuery/database.prepare(...).all so
validation errors return 400 while database/runtime errors return 500: introduce
or use a specific error type/name (e.g., ValidationError) thrown by
parseSupplyPointFilters (or detect known validation conditions), then in the
catch check error instanceof ValidationError or error.name === 'ValidationError'
and respond with 400, otherwise respond with 500 and include the error message;
ensure the response content-type and HEAD handling remain the same.
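The 400/500 split described in that prompt could be sketched as follows. ValidationError and statusForError are illustrative names, not the server's actual code.

```javascript
// Sketch: distinguish client validation failures from server-side errors
// so the request handler can choose the right status code.
class ValidationError extends Error {
  constructor(message) {
    super(message);
    this.name = 'ValidationError';
  }
}

function statusForError(error) {
  // Validation problems are the client's fault (400); everything else,
  // e.g. a corrupt DB file or schema mismatch, is a server error (500).
  return error instanceof ValidationError ? 400 : 500;
}
```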
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: a42ba3b3-cb96-4918-b026-bd5a5d820fb5
📒 Files selected for processing (13)
- .gitignore
- README.md
- frontend/app.js
- frontend/gpx.js
- frontend/index.html
- frontend/route_math.js
- frontend/styles.css
- lib/map_data.js
- package.json
- scripts/export_map_db.js
- scripts/map_dev_server.js
- tests/map_data.test.js
- tests/route_math.test.js
@coderabbitai review
✅ Actions performed: Review triggered.
Actionable comments posted: 4
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@frontend/app.js`:
- Around line 269-272: The frontend currently hardcodes
params.set('limit','10000') and fetches `${API_BASE}/supply-points?...` which,
because lib/map_data.js orders results with "ORDER BY chain, name,
supply_point_id LIMIT :limit", causes nearby points beyond the first 10,000 to
be dropped; change the client to detect when the server returned exactly the
limit and then page or split the bbox and re-request (e.g., add offset/limit
query params or iteratively subdivide bbox and merge results) until a response
returns fewer than the limit (or use a server-provided overflow
flag/total_count), and update the code around params/response handling in
frontend/app.js to loop/fetch additional pages or sub-bboxes rather than relying
on a single 10,000-row request.
- Around line 259-268: When selectedChains() returns an empty array, nothing is
  set on params, so the backend interprets it as "all chains". Check the result
  of selectedChains() and, when chains.length === 0, explicitly add an empty
  chains parameter such as params.set('chains', '') so the frontend sends "no
  chains selected (0 results)" (relevant code: selectedChains(), the params /
  URLSearchParams handling).
In `@lib/map_data.js`:
- Around line 270-289: The parser allows arbitrarily large limit values,
enabling heavy requests; update parseSupplyPointFilters to enforce a sane
maximum for limit by checking the parsed value from parsePositiveInt (or add an
upper-bound check immediately after) and reject or clamp values above a defined
MAX_LIMIT constant (e.g., compare against DEFAULT_LIMIT and a new MAX_LIMIT);
reference parsePositiveInt, parseSupplyPointFilters, DEFAULT_LIMIT and
introduce/use MAX_LIMIT so requests like ?limit=1000000000 are refused or capped
before the data is fetched and serialized.
In `@scripts/map_dev_server.js`:
- Around line 72-75: The current static file serving uses
fs.createReadStream(filePath).pipe(response) without handling stream errors;
wrap the ReadStream usage returned by fs.createReadStream(filePath) and attach
'open' and 'error' handlers (on the stream object returned) to handle races
where the file vanishes after existsSync/statSync; on 'open' pipe to response
and on 'error' write an appropriate status (404 for ENOENT, 500 otherwise) and
end the response, and ensure you do not call response.writeHead twice (use the
MIME_TYPES lookup for content-type only after the 'open' or set a safe default
header before piping). Reference symbols: createReadStream, filePath,
MIME_TYPES, response.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 61788140-8c84-4fc6-b75e-02b79c67748a
📒 Files selected for processing (10)
- .github/workflows/unit-tests.yml
- README.md
- frontend/app.js
- frontend/gpx.js
- lib/map_data.js
- package.json
- scripts/export_map_db.js
- scripts/map_dev_server.js
- tests/map_data.test.js
- tests/route_math.test.js
✅ Files skipped from review due to trivial changes (2)
- .github/workflows/unit-tests.yml
- README.md
🚧 Files skipped from review as they are similar to previous changes (4)
- package.json
- tests/route_math.test.js
- frontend/gpx.js
- scripts/export_map_db.js
@coderabbitai review
✅ Actions performed: Review triggered.
♻️ Duplicate comments (4)
scripts/map_dev_server.js (1)
70-79: ⚠️ Potential issue | 🟠 Major: Static-file TOCTOU / stream errors are not handled.
If the file disappears right after existsSync()/statSync(), statSync() throws, and on the GET path a createReadStream() error also falls straight out of the request handler. This can happen simply while editing frontend assets, so wrap statSync in try/catch and handle the stream's open/error events to fall back to 404/500.
Also applies to: 101-110
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/map_dev_server.js` around lines 70 - 79, Wrap the sync stat check in a try/catch around fs.statSync(filePath) (after the existsSync check) and treat exceptions as a 404 fallback; when creating the read stream with fs.createReadStream(filePath) attach 'open' and 'error' handlers so that on 'error' you write a 500 (or 404 if ENOENT) via response.writeHead/response.end and on 'open' you pipe the stream to response; ensure MIME_TYPES/path.extname logic remains and avoid throwing from the request handler by handling all fs/stat/stream errors (references: filePath, fs.existsSync, fs.statSync, fs.createReadStream, MIME_TYPES, path.extname, response.writeHead, response.end).frontend/app.js (2)
273-283: ⚠️ Potential issue | 🟠 Major: Unchecking all chains searches everything instead of returning 0 results.
When selectedChains() is empty, chains is not sent, so the API drops the chain condition and fetches everything. "All off" in the UI should mean 0 results here.
Suggested fix:

```diff
 async function fetchCandidatePoints(distanceMeters) {
   const chains = selectedChains();
+  if (chains.length === 0) {
+    return { type: 'FeatureCollection', features: [] };
+  }
   const minPointLevel = Number(elements.minPointLevel.value) || 8;
   const bbox = expandedBboxForQuery(distanceMeters);
   const params = new URLSearchParams();
   if (bbox) {
     params.set('bbox', bbox.join(','));
   }
-  if (chains.length > 0) {
-    params.set('chains', chains.join(','));
-  }
+  params.set('chains', chains.join(','));
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app.js` around lines 273 - 283, selectedChains() being empty currently omits the 'chains' query param so the API treats it as "no filter" and returns all results; instead we must send an explicit value that indicates "no chains selected". In fetchCandidatePoints, always set params.set('chains', ...) — use chains.join(',') when chains.length>0 and set params.set('chains','') (or a documented sentinel like 'NONE' if the backend expects it) when chains.length===0 so the server can interpret "all off" as zero results; update the logic around selectedChains and params in fetchCandidatePoints accordingly.
284-291: ⚠️ Potential issue | 🟠 Major: A single fetch with a fixed limit=10000 can drop nearby points.
The server side (lib/map_data.js) uses ORDER BY chain, name, supply_point_id LIMIT :limit, so when the bbox contains more than 10,000 rows, only the first 10,000 reach the client-side distance check. Unless features.length === limit is treated as overflow and paging or bbox subdivision is added, long GPX routes or wide thresholds will miss points.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@frontend/app.js` around lines 284 - 291, The client uses a fixed params.set('limit', "10000") and does a single fetch (`fetch(`${API_BASE}/supply-points?${params.toString()}`)`), which drops points when the bbox contains >limit results; change the fetch logic in frontend/app.js around that call to detect overflow (when returned features.length === limit) and handle it by either implementing paging or recursively subdividing the bbox and issuing additional fetches until each sub-bbox returns <limit; replace the hardcoded '10000' with a LIMIT constant and add re-fetch logic that merges results before distance-based filtering so no nearby points are lost (operate on function/scope that calls the fetch and merges results).lib/map_data.js (1)
282-303: ⚠️ Potential issue | 🟠 Major: The API's limit has no upper bound.
A value like ?limit=1000000000 passes straight through, so database.prepare(...).all() and JSON.stringify() can easily become heavy even locally. Reject or clamp to a sane max in parseSupplyPointFilters().
Suggested fix:

```diff
 const DEFAULT_MIN_POINT_LEVEL = 8;
 const DEFAULT_LIMIT = 5000;
+const MAX_LIMIT = 10000;
 ...
-function parsePositiveInt(value, label, fallback) {
+function parsePositiveInt(value, label, fallback, max = Number.MAX_SAFE_INTEGER) {
   if (value == null || value === '') return fallback;
   const parsed = Number(value);
-  if (!Number.isInteger(parsed) || parsed <= 0) {
-    throw new ValidationError(`${label} must be a positive integer`);
+  if (!Number.isInteger(parsed) || parsed <= 0 || parsed > max) {
+    throw new ValidationError(`${label} must be a positive integer <= ${max}`);
   }
   return parsed;
 }
 ...
-  limit: parsePositiveInt(searchParams.get('limit'), 'limit', DEFAULT_LIMIT)
+  limit: parsePositiveInt(searchParams.get('limit'), 'limit', DEFAULT_LIMIT, MAX_LIMIT)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@lib/map_data.js` around lines 282 - 303, The limit query currently has no upper bound and can cause heavy DB/JSON work; add a sane maximum (e.g., MAX_LIMIT) and enforce it when parsing the limit in parseSupplyPointFilters: either clamp the parsed value to Math.min(parsedLimit, MAX_LIMIT) or validate and throw a ValidationError if parsedLimit > MAX_LIMIT. You can implement this by adding a MAX_LIMIT constant and applying the check/clamp specifically for the 'limit' field in parseSupplyPointFilters (or extend parsePositiveInt to accept an optional max parameter) so DEFAULT_LIMIT and other ints remain unchanged.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@frontend/app.js`:
- Around line 273-283: selectedChains() being empty currently omits the 'chains'
query param so the API treats it as "no filter" and returns all results; instead
we must send an explicit value that indicates "no chains selected". In
fetchCandidatePoints, always set params.set('chains', ...) — use
chains.join(',') when chains.length>0 and set params.set('chains','') (or a
documented sentinel like 'NONE' if the backend expects it) when
chains.length===0 so the server can interpret "all off" as zero results; update the
logic around selectedChains and params in fetchCandidatePoints accordingly.
- Around line 284-291: The client uses a fixed params.set('limit', "10000") and
does a single fetch (`fetch(`${API_BASE}/supply-points?${params.toString()}`)`),
which drops points when the bbox contains >limit results; change the fetch logic
in frontend/app.js around that call to detect overflow (when returned
features.length === limit) and handle it by either implementing paging or
recursively subdividing the bbox and issuing additional fetches until each
sub-bbox returns <limit; replace the hardcoded '10000' with a LIMIT constant and
add re-fetch logic that merges results before distance-based filtering so no
nearby points are lost (operate on function/scope that calls the fetch and
merges results).
In `@lib/map_data.js`:
- Around line 282-303: The limit query currently has no upper bound and can
cause heavy DB/JSON work; add a sane maximum (e.g., MAX_LIMIT) and enforce it
when parsing the limit in parseSupplyPointFilters: either clamp the parsed value
to Math.min(parsedLimit, MAX_LIMIT) or validate and throw a ValidationError if
parsedLimit > MAX_LIMIT. You can implement this by adding a MAX_LIMIT constant
and applying the check/Clamp specifically for the 'limit' field in
parseSupplyPointFilters (or extend parsePositiveInt to accept an optional max
parameter) so DEFAULT_LIMIT and other ints remain unchanged.
In `@scripts/map_dev_server.js`:
- Around line 70-79: Wrap the sync stat check in a try/catch around
fs.statSync(filePath) (after the existsSync check) and treat exceptions as a 404
fallback; when creating the read stream with fs.createReadStream(filePath)
attach 'open' and 'error' handlers so that on 'error' you write a 500 (or 404 if
ENOENT) via response.writeHead/response.end and on 'open' you pipe the stream to
response; ensure MIME_TYPES/path.extname logic remains and avoid throwing from
the request handler by handling all fs/stat/stream errors (references: filePath,
fs.existsSync, fs.statSync, fs.createReadStream, MIME_TYPES, path.extname,
response.writeHead, response.end).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 88930e06-b999-4e90-9aa6-c203a25e1fdf
📒 Files selected for processing (6)
- frontend/app.js
- frontend/gpx.js
- frontend/route_math.js
- lib/map_data.js
- scripts/export_map_db.js
- scripts/map_dev_server.js
✅ Files skipped from review due to trivial changes (1)
- frontend/gpx.js
🚧 Files skipped from review as they are similar to previous changes (2)
- frontend/route_math.js
- scripts/export_map_db.js
Summary
- mart.rideoasis_supply_points with an OpenLayers frontend

Testing
- chanukott.rideoasis_mart.rideoasis_supply_points
- /api/supply-points returns GeoJSON from local SQLite

Closes #19
Summary by CodeRabbit
New Features
Documentation
Tests
Chores