Business Challenge
In Phase 1 of the Grant Generation Workbench, the core hurdle was the system's dependence on multiple biomedical data sources. Queries against large, distributed, and computationally heavy repositories produced high response times. This latency not only slowed AI‑assisted needs assessment and grant drafting but also limited the team's ability to deliver a seamless, compliant, and traceable grant‑development experience.
Merkle Solution
- Implemented a polling‑based retrieval mechanism on Databricks to manage high latency and asynchronous behavior.
- Decoupled the Chat UI and agentic layer from long‑running biomedical searches to maintain responsiveness.
- Designed user queries to trigger background search jobs and return immediate acknowledgments instead of waiting for results.
- Enabled the User Query Agent to assess whether existing data was sufficient or if deeper biomedical searches were required.
- Ran multiple Sub‑Research Agents in parallel, each responsible for summarizing, extracting, or validating specific portions of retrieved documents.
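The acknowledge-then-poll pattern from the bullets above can be sketched as follows. This is a minimal, standard-library illustration only: the production system exposes these operations through FastAPI endpoints and runs searches as Databricks jobs, and all names here (`submit_query`, `poll`, the in-memory job registry) are hypothetical.

```python
import time
import uuid
from concurrent.futures import ThreadPoolExecutor

# In-memory job registry standing in for a persistent store
# (assumption: the real system tracks job state in Databricks).
_jobs = {}
_executor = ThreadPoolExecutor(max_workers=4)

def _run_search(job_id, query):
    # Placeholder for a long-running biomedical repository search.
    time.sleep(0.1)
    _jobs[job_id]["status"] = "done"
    _jobs[job_id]["result"] = f"documents matching {query!r}"

def submit_query(query):
    """Return an acknowledgment immediately; the search runs in the background."""
    job_id = str(uuid.uuid4())
    _jobs[job_id] = {"status": "running", "result": None}
    _executor.submit(_run_search, job_id, query)
    return {"job_id": job_id, "status": "accepted"}

def poll(job_id):
    """Polling call: the Chat UI repeats this until the job completes."""
    return _jobs[job_id]
```

Because `submit_query` returns before the search finishes, the Chat UI and agentic layer stay responsive regardless of repository latency.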
Technology
- FastAPI
- Databricks
- LangGraph
- Claude Sonnet 4.5
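The parallel Sub‑Research Agent fan‑out described in the solution can be sketched with plain asyncio. This is illustrative only: in the actual system the agents are LLM-backed and orchestrated with LangGraph, and the agent names below (`summarize`, `extract`, `validate`) are hypothetical stand-ins.

```python
import asyncio

# Hypothetical sub-agents, each handling one portion of a retrieved document.
async def summarize(doc):
    await asyncio.sleep(0)  # stands in for an LLM call
    return f"summary of {doc}"

async def extract(doc):
    await asyncio.sleep(0)
    return f"entities from {doc}"

async def validate(doc):
    await asyncio.sleep(0)
    return f"validation of {doc}"

async def fan_out(doc):
    # Run the three sub-agents concurrently and collect their outputs in order.
    return await asyncio.gather(summarize(doc), extract(doc), validate(doc))

results = asyncio.run(fan_out("doc-001"))
```

Running the sub-agents concurrently means total latency tracks the slowest agent rather than the sum of all three.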


