ZenML

Leveraging Amazon Q for Integrated Cloud Operations Data Access and Automation

First Orion 2024

First Orion, a telecom software company, implemented Amazon Q to address the challenge of siloed operational data across multiple services. They created a centralized solution that allows cloud operators to interact with various data sources (S3, web content, Confluence) and service platforms (ServiceNow, Jira, Zendesk) through natural language queries. The solution not only provides information access but also enables automated ticket creation and management, significantly streamlining their cloud operations workflow.

Industry

Telecommunications

Overview

First Orion is a data company specializing in telecom software with a focus on making it safe to answer phone calls again (caller ID and spam protection services). This case study, presented as an AWS “This is My Architecture” video, showcases how the company leveraged Amazon Q to address a common pain point in cloud operations: the time-consuming process of gathering information from disparate data sources during troubleshooting.

The core problem that First Orion faced was that cloud operations personnel were spending significant time chasing down details across multiple siloed services rather than actually working on resolving issues. Data was fragmented across various platforms including documentation systems, configuration management databases, ticketing systems, and web resources. This friction slowed down incident response and troubleshooting activities.

Solution Architecture

First Orion’s solution centers around Amazon Q as a hub that connects to multiple data sources, allowing users to interact with their data through natural language conversations. The architecture demonstrates a practical implementation of LLM-powered data access for operational use cases.

Authentication and Access Layer

The system uses AWS Identity Center as the authentication mechanism. Amazon Q integrates with Identity Center to provide a web experience where users can access a Q application. This integration means that cloud operators authenticate through their normal Identity Center credentials and then interact with the Q app directly. The tight integration with AWS’s identity services suggests that access controls and authorization are managed through existing enterprise identity infrastructure.
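The wiring described above can be sketched as request-building code. This is a minimal illustration only: the boto3 `qbusiness` client does expose a `create_application` call, but the parameter names and all ARNs below are placeholders and assumptions, not details given in the talk.

```python
# Hedged sketch: tying an Amazon Q application to IAM Identity Center.
# In a real deployment these parameters would be passed to
# boto3.client("qbusiness").create_application(**params); the field names
# here are assumed, and the ARNs are placeholders.

def build_q_application_params(display_name: str,
                               identity_center_arn: str,
                               service_role_arn: str) -> dict:
    """Assemble request parameters for a Q application that authenticates
    users through their existing Identity Center credentials."""
    return {
        "displayName": display_name,
        "identityCenterInstanceArn": identity_center_arn,
        "roleArn": service_role_arn,
    }

params = build_q_application_params(
    display_name="cloud-ops-assistant",
    identity_center_arn="arn:aws:sso:::instance/ssoins-EXAMPLE",
    service_role_arn="arn:aws:iam::123456789012:role/QBusinessServiceRole",
)
print(sorted(params))  # ['displayName', 'identityCenterInstanceArn', 'roleArn']
```

Because authentication rides on Identity Center, no separate user store is needed for the Q app itself.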

Data Source Integration

Amazon Q in this architecture acts as what the presenter describes as “the hub in a wheel,” with spokes connecting out to the individual data sources (S3, web content, and Confluence), each indexed so that operators can query them from a single conversational interface.
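The hub-and-spoke wiring can be sketched as one connector registration per source. The connector type names and the request shape below are assumptions modeled loosely on the Q Business data source connector API, not details from the presentation.

```python
# Hedged sketch of the hub-and-spoke pattern: one Q application (the hub)
# with a connector registered per data source (the spokes). The spoke
# names are illustrative; in a real deployment each dict would feed a
# create_data_source-style call on the qbusiness client.

SPOKES = [
    {"type": "S3",         "name": "ops-artifacts-bucket"},
    {"type": "WEBCRAWLER", "name": "internal-runbooks-site"},
    {"type": "CONFLUENCE", "name": "platform-team-wiki"},
]

def register_spokes(application_id: str, index_id: str, spokes: list) -> list:
    """Return one connector-registration request per spoke."""
    return [
        {
            "applicationId": application_id,
            "indexId": index_id,
            "displayName": s["name"],
            "configuration": {"type": s["type"]},
        }
        for s in spokes
    ]

requests = register_spokes("app-123", "idx-456", SPOKES)
print(len(requests))  # 3
```

Adding a new spoke is then a matter of appending another entry rather than building a custom integration.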

Plugins for Actionable Operations

A key differentiator in this implementation is the use of Amazon Q plugins to enable not just read operations but also write operations to external systems. The presenter specifically mentions plugins for the company's service platforms (ServiceNow, Jira, and Zendesk), which allow operators to create and manage tickets directly from the conversation.

This capability removes the friction of context-switching between systems during troubleshooting. An operator who identifies an issue can immediately create a ticket without leaving the Q interface, maintaining their focus on the problem at hand.
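The write path can be illustrated as the kind of structured payload a ticketing plugin submits on the operator's behalf. The field names below mirror a generic Jira-style issue body; they are illustrative assumptions, not the actual plugin contract described in the talk.

```python
# Hedged sketch of the plugin write path: an operator's natural-language
# request is resolved into a structured ticket body for the ticketing
# system. Field names are generic/illustrative, not First Orion's schema.

def draft_ticket(summary: str, description: str, priority: str = "Medium") -> dict:
    """Build the ticket body a Q plugin would submit to the ticketing system."""
    allowed = {"Low", "Medium", "High", "Critical"}
    if priority not in allowed:
        raise ValueError(f"priority must be one of {sorted(allowed)}")
    return {
        "fields": {
            "summary": summary,
            "description": description,
            "priority": {"name": priority},
        }
    }

ticket = draft_ticket(
    summary="Elevated 5xx rate on API gateway",
    description="Spotted while querying runbooks through Q; needs triage.",
    priority="High",
)
print(ticket["fields"]["priority"]["name"])  # High
```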

Technical Decision: Amazon Q vs. Amazon Bedrock

The case study provides insight into First Orion’s decision-making process when choosing between Amazon Q and Amazon Bedrock for this use case. The presenter notes that Bedrock would have been a “slightly bigger lift,” requiring the team to assemble and maintain more of the implementation themselves.

For their use case of simply wanting to “talk to my data,” Amazon Q provided a simpler path to production. The presenter emphasizes that Q provides many of the same capabilities as Bedrock, including guardrails to ensure users can only access data they’re authorized to see.

This decision point is valuable for organizations evaluating their LLMOps approach. Amazon Q represents a more managed, opinionated solution that trades some flexibility for faster time-to-value, while Bedrock offers more customization at the cost of additional implementation complexity.
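The "bigger lift" can be made concrete: building on Bedrock means owning each stage of a retrieval-augmented pipeline yourself, stages that Q bundles into the managed service. The retriever and generator below are trivial stand-ins to show the shape of that work, not Bedrock API calls.

```python
# Hedged sketch of the pipeline you assemble yourself with Bedrock:
# retrieval, prompt construction, model invocation, and post-processing.
# Both stages here are simulated stand-ins, not real Bedrock calls.

def retrieve(query: str, corpus: dict) -> list:
    """Stand-in retriever: naive keyword match over an indexed corpus."""
    return [doc for doc, text in corpus.items() if query.lower() in text.lower()]

def answer(query: str, corpus: dict) -> str:
    """Chain the stages a self-built RAG pipeline must own."""
    hits = retrieve(query, corpus)
    prompt = f"Context: {hits}\nQuestion: {query}"      # prompt construction
    return f"answer based on {len(hits)} document(s)"   # model-call stand-in

corpus = {"runbook-7": "Restart steps for the billing service"}
print(answer("billing", corpus))  # answer based on 1 document(s)
```

With Amazon Q, all four stages (plus connectors and access control) come managed, which is the time-to-value trade described above.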

Production Considerations

Guardrails and Access Control

The presentation mentions that Q provides guardrails similar to Bedrock, ensuring that “a person should only be able to see what they’re authorized to see.” This is a critical production concern for enterprise LLM deployments, particularly when dealing with operational data that may include sensitive configuration details, credentials, or customer information.
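The guardrail described amounts to document-level access control: each indexed document carries principal metadata, and answers are drawn only from documents the authenticated user may see. A minimal sketch of that filtering logic, with illustrative group names:

```python
# Hedged sketch of "only see what you're authorized to see": each indexed
# document carries access-control metadata, and retrieval is filtered by
# the caller's group membership. Group and document names are illustrative.

DOCS = [
    {"id": "db-creds-runbook", "allowed_groups": {"cloud-ops"}},
    {"id": "public-faq",       "allowed_groups": {"cloud-ops", "support"}},
]

def visible_docs(user_groups: set, docs: list) -> list:
    """Return only documents whose ACL intersects the user's groups."""
    return [d["id"] for d in docs if d["allowed_groups"] & user_groups]

print(visible_docs({"support"}, DOCS))    # ['public-faq']
print(visible_docs({"cloud-ops"}, DOCS))  # ['db-creds-runbook', 'public-faq']
```

In the managed service this enforcement comes from the Identity Center integration rather than application code, but the effect is the same: sensitive runbooks never reach an unauthorized user's answers.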

User Experience

The solution is designed around natural language interaction. Users simply ask questions in plain English and receive answers based on the indexed content. This dramatically lowers the barrier to accessing operational knowledge and reduces the learning curve for new team members who may not know where specific information lives.
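The conversational entry point reduces to packaging a plain-English question as a chat request against the Q application. The field names below mimic a chat_sync-style call on the Q Business API but are assumptions for illustration, not confirmed details from the case study.

```python
# Hedged sketch of the user-facing interaction: an operator's question in
# plain English packaged as a chat request. Field names are assumed to
# resemble the Q Business chat API; the application ID is a placeholder.

def build_chat_request(application_id: str, user_message: str) -> dict:
    """Package an operator's plain-English question for the Q application."""
    if not user_message.strip():
        raise ValueError("message must be non-empty")
    return {
        "applicationId": application_id,
        "userMessage": user_message,
    }

req = build_chat_request("app-123", "Where is the runbook for DNS failover?")
print(req["userMessage"])
```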

Integration Simplification

A subtle but important point raised is that the integration with data sources through Q simplifies the traditional process of having to “find the person who owns it to get connectivity.” Amazon Q’s native connectors provide a standardized way to connect to common enterprise systems without custom integration work.

Critical Assessment

While this case study presents a compelling use of Amazon Q for cloud operations, there are some limitations to consider: the source is a short AWS “This is My Architecture” promotional video, and no quantitative results (such as time saved per incident or faster resolution rates) are shared to substantiate the workflow improvements.

Nevertheless, the architecture demonstrates a practical pattern for deploying LLMs in an operational context: using a managed service (Amazon Q) as a conversational interface to federated data sources, with the ability to take actions through plugins. This hub-and-spoke model for data access is likely applicable to many enterprise scenarios beyond cloud operations.

More Like This

GenAI Transformation of Manufacturing and Supply Chain Operations

Jabil 2024

Jabil, a global manufacturing company with $29B in revenue and 140,000 employees, implemented Amazon Q to transform their manufacturing and supply chain operations. They deployed GenAI solutions across three key areas: shop floor operations assistance (Ask Me How), procurement intelligence (PIP), and supply chain management (V-command). The implementation helped reduce downtime, improve operator efficiency, enhance procurement decisions, and accelerate sales cycles for their supply chain services. The company established robust governance through AI and GenAI councils while ensuring responsible AI usage and clear value creation.


Agentic AI Copilot for Insurance Underwriting with Multi-Tool Integration

Snorkel 2025

Snorkel developed a specialized benchmark dataset for evaluating AI agents in insurance underwriting, leveraging their expert network of Chartered Property and Casualty Underwriters (CPCUs). The benchmark simulates an AI copilot that assists junior underwriters by reasoning over proprietary knowledge, using multiple tools including databases and underwriting guidelines, and engaging in multi-turn conversations. The evaluation revealed significant performance variations across frontier models (single digits to ~80% accuracy), with notable error modes including tool use failures (36% of conversations) and hallucinations from pretrained domain knowledge, particularly from OpenAI models which hallucinated non-existent insurance products 15-45% of the time.


AI-Powered Network Operations Assistant with Multi-Agent RAG Architecture

Swisscom 2025

Swisscom, Switzerland's leading telecommunications provider, developed a Network Assistant using Amazon Bedrock to address the challenge of network engineers spending over 10% of their time manually gathering and analyzing data from multiple sources. The solution implements a multi-agent RAG architecture with specialized agents for documentation management and calculations, combined with an ETL pipeline using AWS services. The system is projected to reduce routine data retrieval and analysis time by 10%, saving approximately 200 hours per engineer annually while maintaining strict data security and sovereignty requirements for the telecommunications sector.
