
feat: stable random ports for service instances #340

Open
moizpgedge wants to merge 1 commit into main from feat/PLAT-510/Use-random-port-allocator

Conversation

@moizpgedge (Contributor) commented Apr 13, 2026

Adopt the random port allocation mechanism for supporting services (MCP, PostgREST, RAG), mirroring the existing instance port behaviour.

When a service spec sets port: 0, the Control Plane allocates a stable random port from the configured range, persists it in etcd, and reuses it on every subsequent re-plan. The port is released when the service is removed or the database is deleted.

Changes:

  • Add ServiceInstanceSpecStore to persist ServiceInstanceSpec in etcd under service_instance_specs/{database_id}/{service_instance_id}
  • Add CopyPortFrom to ServiceInstanceSpec using the existing reconcilePort helper
  • Add ReconcileServiceInstanceSpec and DeleteServiceInstanceSpec to database.Service, following the same pattern as ReconcileInstanceSpec / DeleteInstanceSpec
  • Call ReconcileServiceInstanceSpec in the GenerateServiceInstanceResources activity before generating resources, so allocation runs on the correct host
  • Update DeleteDatabase to release service instance ports and include ServiceInstanceSpec.DeleteByDatabaseID in the deletion transaction
  • Update DeleteServiceInstance to call DeleteServiceInstanceSpec

PLAT-510

Summary

Adopts the random port allocation mechanism (introduced in #290) for supporting services (MCP, PostgREST, RAG). When a service spec sets port: 0, the Control Plane allocates a stable random port from the configured range, persists it in etcd, and reuses it on every subsequent re-plan.

Testing

  • go test ./server/internal/database/... ./server/internal/workflows/... — all pass
  • Manual verification in compose-dev environment: created a database, added an MCP service with port: 0, confirmed a stable random port was allocated (host_port: 12497) and the Docker service ran 1/1
  • Re-planned with port: 0 and confirmed the same port was retained

Checklist

  • Tests added or updated (unit and/or e2e, as needed)
  • Documentation updated (if needed)
  • Issue is linked (branch name or URL in PR description)
  • Changelog entry added for user-facing behavior changes
  • Breaking changes (if any) are clearly called out in the PR description

@coderabbitai

coderabbitai bot commented Apr 13, 2026

Warning

Rate limit exceeded

@moizpgedge has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 51 minutes and 45 seconds before requesting another review.

Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 51 minutes and 45 seconds.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: c582148c-05c8-40ef-9c16-b5ce272dae41

📥 Commits

Reviewing files that changed from the base of the PR and between b5d3f35 and 8e34f68.

📒 Files selected for processing (5)
  • server/internal/database/service.go
  • server/internal/database/service_instance.go
  • server/internal/database/service_instance_spec_store.go
  • server/internal/database/store.go
  • server/internal/workflows/activities/generate_service_instance_resources.go
📝 Walkthrough

Walkthrough

Service instance specification management functionality is introduced with etcd-backed storage, port lifecycle handling, and reconciliation logic. Changes include a new store, service methods for spec operations, and integration of reconciliation in workflows.

Changes

Cohort / File(s) — Summary

  • Service Instance Spec Store (server/internal/database/service_instance_spec_store.go): New etcd-backed store for ServiceInstanceSpec with methods for hierarchical key management, retrieval by key/database, updates, and deletion operations.
  • Service Layer Updates (server/internal/database/service.go): Enhanced DeleteDatabase and DeleteServiceInstance to manage service instance specs; added ReconcileServiceInstanceSpec for port allocation and spec persistence; added DeleteServiceInstanceSpec for port release and spec cleanup.
  • Service Instance Model & Store Initialization (server/internal/database/service_instance.go, server/internal/database/store.go): Added CopyPortFrom method to ServiceInstanceSpec for port reconciliation; updated Store to include and initialize the ServiceInstanceSpecStore field.
  • Workflow Activity Integration (server/internal/workflows/activities/generate_service_instance_resources.go): Integrated spec reconciliation via ReconcileServiceInstanceSpec before resource generation; returns an error on reconciliation failure.

Poem

🐰 Hopping through specs with etcd so fine,
Ports allocated in a structured line,
Reconcile, release, and store with care,
Service instances dancing in the air! ✨

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage ⚠️ Warning — Docstring coverage is 0.00%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)

  • Description check ✅ Passed — The PR description includes all required sections: Summary, Changes (detailed bullet list), Testing, and Checklist. The issue is linked via PLAT-510. The description comprehensively explains the feature and implementation.
  • Title check ✅ Passed — The pull request title "feat: stable random ports for service instances" directly and accurately summarizes the main change: implementing stable random port allocation for service instances.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


@codacy-production

codacy-production bot commented Apr 13, 2026

Up to standards ✅

🟢 Issues: 1 medium

Results: 1 new issue

  • Complexity: 1 medium

View in Codacy

🟢 Metrics: 35 complexity · 4 duplication

  • Complexity: 35
  • Duplication: 4

View in Codacy


@moizpgedge changed the title from "feat: stable random ports for service instances" to "PLAT-510 feat: stable random ports for service instances" on Apr 15, 2026
@moizpgedge force-pushed the feat/PLAT-510/Use-random-port-allocator branch from b5d3f35 to 8e34f68 on April 15, 2026 at 13:28