First Orion, a telecom software company, implemented Amazon Q to address the challenge of siloed operational data across multiple services. They created a centralized solution that allows cloud operators to interact with various data sources (S3, web content, Confluence) and service platforms (ServiceNow, Jira, Zendesk) through natural language queries. The solution not only provides information access but also enables automated ticket creation and management, significantly streamlining their cloud operations workflow.
First Orion is a data company specializing in telecom software with a focus on making it safe to answer phone calls again (caller ID and spam protection services). This case study, presented as an AWS “This is My Architecture” video, showcases how the company leveraged Amazon Q to address a common pain point in cloud operations: the time-consuming process of gathering information from disparate data sources during troubleshooting.
The core problem that First Orion faced was that cloud operations personnel were spending significant time chasing down details across multiple siloed services rather than actually working on resolving issues. Data was fragmented across various platforms including documentation systems, configuration management databases, ticketing systems, and web resources. This friction slowed down incident response and troubleshooting activities.
First Orion’s solution centers around Amazon Q as a hub that connects to multiple data sources, allowing users to interact with their data through natural language conversations. The architecture demonstrates a practical implementation of LLM-powered data access for operational use cases.
The system uses AWS Identity Center as the authentication mechanism. Amazon Q integrates with Identity Center to provide a web experience where users can access a Q application. This integration means that cloud operators authenticate through their normal Identity Center credentials and then interact with the Q app directly. The tight integration with AWS’s identity services suggests that access controls and authorization are managed through existing enterprise identity infrastructure.
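As a sketch of what that wiring looks like, the request below assembles the parameters for creating an Amazon Q Business application bound to an Identity Center instance. Field names follow the Amazon Q Business CreateApplication API as exposed by boto3; the application name and ARN values are placeholders, not First Orion's actual configuration.

```python
def build_q_app_request(display_name: str, identity_center_arn: str, role_arn: str) -> dict:
    """Assemble the request that would be passed to
    boto3.client("qbusiness").create_application(**request)."""
    return {
        "displayName": display_name,
        # Binds the Q web experience to the org's IAM Identity Center
        # instance, so operators sign in with their existing credentials.
        "identityCenterInstanceArn": identity_center_arn,
        # Service role Q assumes when accessing connected data sources.
        "roleArn": role_arn,
    }

request = build_q_app_request(
    "cloud-ops-assistant",  # hypothetical application name
    "arn:aws:sso:::instance/ssoins-EXAMPLE",
    "arn:aws:iam::123456789012:role/QBusinessServiceRole",
)
```

Because authorization rides on Identity Center, no separate user database is needed for the Q application itself.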
Amazon Q in this architecture acts as what the presenter describes as “the hub in a wheel” with spokes connecting to various data sources:
Amazon S3: Used to store documents such as PDFs that Q can index and query. This allows operational documentation, runbooks, and other static content to be made searchable through natural language.
Web Crawlers: Q is configured to crawl specific domains and follow links to a configurable depth (the example mentions going two, three, or five levels deep). This capability allows Q to ingest external documentation, vendor resources, or any web-accessible content that operators might need during troubleshooting.
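Amazon Q's crawler connector handles this traversal internally, but the depth-limited link-following it describes amounts to a bounded breadth-first search. The toy link graph below stands in for pages a real crawler would fetch and parse.

```python
from collections import deque

def crawl(start_url: str, links: dict, max_depth: int) -> set:
    """Breadth-first traversal that follows links only to max_depth,
    mirroring Amazon Q's configurable crawl depth. `links` maps each
    page to the pages it links to."""
    seen = {start_url}
    queue = deque([(start_url, 0)])
    while queue:
        url, depth = queue.popleft()
        if depth == max_depth:
            continue  # don't follow links beyond the configured depth
        for nxt in links.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return seen

# Toy site: docs links to setup and api; api links to reference.
site = {"docs": ["setup", "api"], "api": ["reference"]}
print(sorted(crawl("docs", site, max_depth=1)))  # ['api', 'docs', 'setup']
```

Raising `max_depth` to 2 would also pull in `reference`, which is exactly the trade-off the presenter alludes to when choosing between two, three, or five levels.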
Confluence: A common enterprise knowledge base and documentation platform. Rather than requiring operators to manually search and read through knowledge articles and blogs in Confluence, Q indexes this content and can surface relevant information in response to natural language queries. As the presenter notes, “Q reads it for you.”
AWS Config via ServiceNow CMDB: This is an interesting integration pattern. AWS Config takes snapshots of the current state of AWS resources. First Orion uses ServiceNow as their Configuration Management Database (CMDB), and there’s a connector between ServiceNow and AWS. This means that when users query their CMDB through Q, they’re effectively getting access to AWS Config data about their infrastructure state. This provides operators with real-time visibility into their AWS environment’s configuration without needing to navigate multiple consoles.
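The case study does not show the ServiceNow connector details, but the kind of infrastructure-state lookup it surfaces corresponds to AWS Config advanced queries, which use a SQL-like expression language. The sketch below builds such an expression; the resource type and region are illustrative.

```python
def config_query(resource_type: str, region: str) -> str:
    """Build an AWS Config advanced-query expression for resources of a
    given type in a region -- the sort of infrastructure-state question
    the CMDB integration lets operators ask through Q."""
    return (
        "SELECT resourceId, resourceType, configuration "
        "WHERE resourceType = '{}' AND awsRegion = '{}'".format(resource_type, region)
    )

expr = config_query("AWS::EC2::Instance", "us-east-1")
# In a live environment this expression would be executed with:
#   boto3.client("config").select_resource_config(Expression=expr)
```

In First Orion's setup the operator never writes such a query: Q translates the natural language question and the ServiceNow connector mediates access to the Config data.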
A key differentiator in this implementation is the use of Amazon Q plugins to enable not just read operations but also write operations to external systems. The presenter specifically mentions:
Jira Integration: Operators can ask Q to create a Jira ticket directly from the conversation. When they do so, a window pops up for them to fill in the necessary details, they submit, and a ticket is created in Jira. They receive the normal confirmations including a ticket creation response and follow-up email with the ticket details.
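The Q plugin abstracts this away behind the popup form, but the ticket creation it performs maps onto Jira's standard create-issue REST endpoint. The sketch below builds such a request body; the project key, summary, and instance domain are hypothetical.

```python
import json

def jira_issue_payload(project_key: str, summary: str, description: str) -> dict:
    """Request body for Jira's create-issue endpoint
    (POST /rest/api/2/issue) -- the kind of call issued once the
    operator submits the popup form. Field structure follows the
    Jira REST API."""
    return {
        "fields": {
            "project": {"key": project_key},  # hypothetical project key
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Task"},
        }
    }

payload = jira_issue_payload(
    "OPS", "EC2 instance unreachable",
    "Reported via Q conversation during triage.")
body = json.dumps(payload)
# A client would POST `body` to https://<your-domain>/rest/api/2/issue
```

The confirmation and follow-up email the operators receive are Jira's normal notifications; the plugin simply triggers the same workflow any API client would.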
Zendesk and Salesforce: These are mentioned as additional systems where Q plugins could enable similar functionality.
This capability removes the friction of context-switching between systems during troubleshooting. An operator who identifies an issue can immediately create a ticket without leaving the Q interface, maintaining their focus on the problem at hand.
The case study provides insight into First Orion’s decision-making process when choosing between Amazon Q and Amazon Bedrock for this use case. The presenter notes that Bedrock would have been a “slightly bigger lift,” requiring additional implementation work that their use case did not justify.
For their use case of simply wanting to “talk to my data,” Amazon Q provided a simpler path to production. The presenter emphasizes that Q provides many of the same capabilities as Bedrock, including guardrails to ensure users can only access data they’re authorized to see.
This decision point is valuable for organizations evaluating their LLMOps approach. Amazon Q represents a more managed, opinionated solution that trades some flexibility for faster time-to-value, while Bedrock offers more customization at the cost of additional implementation complexity.
The presentation mentions that Q provides guardrails similar to Bedrock, ensuring that “a person should only be able to see what they’re authorized to see.” This is a critical production concern for enterprise LLM deployments, particularly when dealing with operational data that may include sensitive configuration details, credentials, or customer information.
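Amazon Q enforces these document-level permissions natively, but the underlying principle, filtering retrieved content against the requesting user's entitlements before it ever reaches the model, can be sketched simply. The `allowed_groups` ACL below is a deliberate simplification of Q's permission model.

```python
def filter_authorized(docs: list, user_groups: set) -> list:
    """Drop any retrieved document the user is not entitled to see,
    before it is passed to the model as context. Each doc carries an
    `allowed_groups` ACL (a simplification of real document-level
    permission models)."""
    return [d for d in docs if user_groups & set(d["allowed_groups"])]

docs = [
    {"id": "runbook-1", "allowed_groups": ["cloud-ops"]},
    {"id": "payroll-faq", "allowed_groups": ["hr"]},
]
print(filter_authorized(docs, {"cloud-ops"}))  # only runbook-1 survives
```

Filtering at retrieval time, rather than trusting the model to withhold information, is what makes guardrails like this enforceable.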
The solution is designed around natural language interaction. Users simply ask questions in plain English and receive answers based on the indexed content. This dramatically lowers the barrier to accessing operational knowledge and reduces the learning curve for new team members who may not know where specific information lives.
A subtle but important point raised is that the integration with data sources through Q simplifies the traditional process of having to “find the person who owns it to get connectivity.” Amazon Q’s native connectors provide a standardized way to connect to common enterprise systems without custom integration work.
While this case study presents a compelling use of Amazon Q for cloud operations, there are some limitations to consider, not least that the account comes from a short vendor presentation and omits quantitative results.
Nevertheless, the architecture demonstrates a practical pattern for deploying LLMs in an operational context: using a managed service (Amazon Q) as a conversational interface to federated data sources, with the ability to take actions through plugins. This hub-and-spoke model for data access is likely applicable to many enterprise scenarios beyond cloud operations.
Jabil, a global manufacturing company with $29B in revenue and 140,000 employees, implemented Amazon Q to transform their manufacturing and supply chain operations. They deployed GenAI solutions across three key areas: shop floor operations assistance (Ask Me How), procurement intelligence (PIP), and supply chain management (V-command). The implementation helped reduce downtime, improve operator efficiency, enhance procurement decisions, and accelerate sales cycles for their supply chain services. The company established robust governance through AI and GenAI councils while ensuring responsible AI usage and clear value creation.
Snorkel developed a specialized benchmark dataset for evaluating AI agents in insurance underwriting, leveraging their expert network of Chartered Property and Casualty Underwriters (CPCUs). The benchmark simulates an AI copilot that assists junior underwriters by reasoning over proprietary knowledge, using multiple tools including databases and underwriting guidelines, and engaging in multi-turn conversations. The evaluation revealed significant performance variations across frontier models (single digits to ~80% accuracy), with notable error modes including tool use failures (36% of conversations) and hallucinations from pretrained domain knowledge, particularly from OpenAI models which hallucinated non-existent insurance products 15-45% of the time.
Swisscom, Switzerland's leading telecommunications provider, developed a Network Assistant using Amazon Bedrock to address the challenge of network engineers spending over 10% of their time manually gathering and analyzing data from multiple sources. The solution implements a multi-agent RAG architecture with specialized agents for documentation management and calculations, combined with an ETL pipeline using AWS services. The system is projected to reduce routine data retrieval and analysis time by 10%, saving approximately 200 hours per engineer annually while maintaining strict data security and sovereignty requirements for the telecommunications sector.