ZenML

Generative AI Assistant for Agricultural Field Trial Analysis

Agmatix 2024

Agmatix developed Leafy, a generative AI assistant powered by Amazon Bedrock, to streamline agricultural field trial analysis. The solution addresses challenges in analyzing complex trial data by enabling agronomists to query data using natural language, automatically selecting appropriate visualizations, and providing insights. Using Amazon Bedrock with Anthropic Claude, along with AWS services for data pipeline management, the system achieved 20% improved efficiency, 25% better data integrity, and tripled analysis throughput.

Industry

Other

Overview

Agmatix is an agricultural technology (AgTech) company that specializes in data-driven solutions for the agriculture industry. Their core focus is on expediting R&D processes for seed genetics, fertilizers, and crop protection molecules (including pesticides, herbicides, fungicides, and biologicals). A significant challenge in agricultural innovation is the management and analysis of field trial data—experiments conducted to test the effectiveness of new crop varieties and agricultural chemicals in real-world conditions. Agmatix has built a platform that collects, manages, and analyzes this agricultural field trial data, and they have now integrated generative AI capabilities to enhance the analytical experience for their users.

This case study describes how Agmatix implemented a generative AI assistant called “Leafy” using Amazon Bedrock and Anthropic Claude to help agronomists and researchers analyze complex field trial data more efficiently. The solution aims to transform historically manual, time-consuming analytical processes into conversational, natural language interactions.

The Problem

Agricultural field trials generate vast amounts of complex data that present several challenges for agronomists and researchers. Without generative AI, building analytical dashboards and gaining meaningful insights from field trials was complex and time-consuming, requiring significant manual effort from agronomists who could otherwise focus on higher-value strategic work.

The Solution Architecture

Agmatix’s technology architecture is built entirely on AWS, with a data pipeline that consists of several components working together before the generative AI layer is applied:

Data Pipeline Foundation: Amazon S3 for data storage, AWS Glue for transformation, and AWS Lambda for processing, which together deliver the clean, prepared trial data that feeds the generative AI layer.

Generative AI Integration: The generative AI chatbot application (Leafy) is built on three fundamental components layered on top of this data pipeline.

When an agronomist asks Leafy a question, Agmatix’s Insights solution sends a request to Anthropic Claude on Amazon Bedrock through an API. The prompt sent to Claude combines two elements: the user’s question and the relevant contextual data, namely trial results and analysis rules.
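The request flow described above can be sketched with the AWS SDK for Python (boto3). This is an illustrative assumption, not Agmatix’s actual code; the model ID, prompt layout, and field names are all hypothetical:

```python
import json


def build_request(question: str, trial_data: str, rules: str) -> dict:
    """Assemble a Bedrock Messages-API payload combining the user's
    question with plain-text trial data and analysis rules."""
    prompt = (
        f"Rules:\n{rules}\n\n"
        f"Trial data:\n{trial_data}\n\n"
        f"Question: {question}"
    )
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask_leafy(question: str, trial_data: str, rules: str) -> str:
    """Send the assembled prompt to Anthropic Claude on Amazon Bedrock."""
    import boto3  # requires AWS credentials at runtime

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model ID
        body=json.dumps(build_request(question, trial_data, rules)),
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]
```

Because Bedrock exposes multiple foundation models behind the same `invoke_model` API, swapping models is largely a matter of changing the `modelId` string, which matches the flexibility described later in this write-up.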

Workflow Details

When a user interacts with the system, the workflow centers on prompt construction. The data used in prompt engineering (trial results and rules) is stored in plain text and sent to the model directly. Prompt engineering is described as playing a central role in this generative AI solution, with the team following Anthropic Claude’s prompt engineering guidelines.
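Anthropic’s prompt engineering guidance recommends delimiting distinct prompt sections, for example with XML-style tags, so the model can separate instructions from data. A hypothetical sketch of how plain-text trial results and rules might be assembled (the tag names and wording are assumptions, not Agmatix’s actual prompts):

```python
def assemble_prompt(question: str, trial_results: str, rules: str) -> str:
    """Wrap each prompt element in XML-style tags, following Anthropic's
    guidance, so Claude can distinguish the data from the instructions."""
    return (
        "You are an assistant that analyzes agricultural field trial data.\n"
        f"<rules>\n{rules}\n</rules>\n"
        f"<trial_results>\n{trial_results}\n</trial_results>\n"
        f"Using only the trial results above, answer: {question}"
    )


# Example: the trial data travels as plain text inside the prompt itself,
# rather than being retrieved from a vector store.
prompt = assemble_prompt(
    "Which plot yielded most?",
    "plot A: 5.2 t/ha\nplot B: 6.1 t/ha",
    "Report yields with units.",
)
```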

LLMOps Considerations

Several LLMOps-relevant aspects can be observed from this implementation:

Model Selection and Integration: Agmatix chose Amazon Bedrock as their foundation model service, specifically using Anthropic Claude. Amazon Bedrock is described as a fully managed, serverless generative AI offering that provides access to multiple high-performance foundation models through a single API. This approach allows them to potentially swap models or experiment with different options without significant infrastructure changes.

Prompt Engineering: The solution relies heavily on prompt engineering rather than fine-tuning or RAG (Retrieval Augmented Generation). The prompts combine user questions with contextual instructions and trial-specific data. This suggests a relatively straightforward prompt-based approach where the relevant data is included directly in the prompt context rather than retrieved from a vector database.

Data Strategy: The case study emphasizes that having a well-defined data strategy is the first step in developing and deploying generative AI use cases. The existing data pipeline (S3, Glue, Lambda) provides clean, transformed data that can be fed into the generative AI layer. This highlights the importance of data quality and preparation in LLMOps.
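The case study only names the pipeline services; as a minimal, hypothetical sketch of the “clean, transformed data” step, a Lambda-style transform might drop incomplete records and normalize numeric fields before the data reaches the generative AI layer (the column names here are assumptions):

```python
import csv
import io


def clean_trial_records(raw_csv: str) -> list[dict]:
    """Drop incomplete rows and normalize numeric fields so the
    downstream generative AI layer receives consistent trial data."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not row.get("plot_id") or not row.get("yield_kg_ha"):
            continue  # skip records missing required fields
        cleaned.append(
            {
                "plot_id": row["plot_id"].strip(),
                "treatment": row.get("treatment", "").strip(),
                "yield_kg_ha": float(row["yield_kg_ha"]),
            }
        )
    return cleaned
```

In an actual deployment this function would sit inside a Lambda handler triggered by new objects landing in S3, but the core point stands regardless of the trigger: data quality work happens before, not inside, the prompt.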

Serverless Architecture: By using Amazon Bedrock, Agmatix avoids managing their own model infrastructure. This is a common LLMOps pattern where companies leverage managed services to reduce operational overhead.

Integration with Existing Systems: The generative AI capability is integrated into an existing product (Insights), demonstrating how LLM features can be added to enhance existing applications rather than requiring entirely new systems.

Reported Results

According to Agmatix’s claims, using Amazon Bedrock on AWS for their data-driven field trials service resulted in:

- 20% improved efficiency
- 25% better data integrity
- Tripled analysis throughput

It’s important to note that these metrics come from the company itself and are presented in what is essentially a promotional case study published on AWS’s blog. The specific methodology for measuring these improvements is not detailed, so these figures should be interpreted with appropriate caution.

Practical Application

A practical application mentioned is the largest open nutrient database for crop nutrition, powered by Agmatix infrastructure, where researchers can access insights from thousands of field trials. Users benefit from guided question prompts and responses facilitated by generative AI, helping them understand trends in crop nutrient uptake and removal and simplify the creation of decision support systems.

Critical Assessment

While this case study presents a compelling use case for generative AI in agricultural research, several aspects warrant careful consideration: the performance metrics are self-reported without a stated measurement methodology, and the write-up says little about how the LLM-generated analyses are evaluated or monitored in production.

Despite these limitations, the case study does illustrate a legitimate and increasingly common LLMOps pattern: using foundation models through managed services to enhance existing data products with natural language interfaces, thereby reducing the complexity barrier for end users to access and analyze data.

More Like This

Natural Language Analytics Assistant Using Amazon Bedrock Agents

Skai 2025

Skai, an omnichannel advertising platform, developed Celeste, an AI agent powered by Amazon Bedrock Agents, to transform how customers access and analyze complex advertising data. The solution addresses the challenge of time-consuming manual report generation (taking days or weeks) by enabling natural language queries that automatically collect data from multiple sources, synthesize insights, and provide actionable recommendations. The implementation reduced report generation time by 50%, case study creation by 75%, and transformed weeks-long processes into minutes while maintaining enterprise-grade security and privacy for sensitive customer data.


Agentic AI Copilot for Insurance Underwriting with Multi-Tool Integration

Snorkel 2025

Snorkel developed a specialized benchmark dataset for evaluating AI agents in insurance underwriting, leveraging their expert network of Chartered Property and Casualty Underwriters (CPCUs). The benchmark simulates an AI copilot that assists junior underwriters by reasoning over proprietary knowledge, using multiple tools including databases and underwriting guidelines, and engaging in multi-turn conversations. The evaluation revealed significant performance variations across frontier models (single digits to ~80% accuracy), with notable error modes including tool use failures (36% of conversations) and hallucinations from pretrained domain knowledge, particularly from OpenAI models which hallucinated non-existent insurance products 15-45% of the time.


AI-Powered Financial Assistant for Automated Expense Management

Brex 2025

Brex developed an AI-powered financial assistant to automate expense management workflows, addressing the pain points of manual data entry, policy compliance, and approval bottlenecks that plague traditional finance operations. Using Amazon Bedrock with Claude models, they built a comprehensive system that automatically processes expenses, generates compliant documentation, and provides real-time policy guidance. The solution achieved 75% automation of expense workflows, saving hundreds of thousands of hours monthly across customers while improving compliance rates from 70% to the mid-90s, demonstrating how LLMs can transform enterprise financial operations when properly integrated with existing business processes.
