Basil Faruqui, the director of solutions marketing at BMC Software, shares insights into the significance of DataOps and data orchestration, and into how AI can enhance complex workflow automation for business success.

Recent Developments at BMC

BMC is experiencing an exciting phase, especially with our Control-M product line. We are assisting some of the world’s largest enterprises in automating and orchestrating business outcomes reliant on intricate workflows. A significant focus of our strategy is on DataOps, particularly in orchestration within this practice. Over the past year, we have implemented over seventy integrations with serverless and PaaS offerings across platforms like AWS, Azure, and GCP, allowing our customers to quickly incorporate modern cloud services into their Control-M orchestration patterns. Additionally, we are developing use cases based on generative AI to enhance workflow development and optimize runtime.
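
To make this concrete, here is a minimal sketch of what submitting such a cloud-service job through the Control-M Automation API might look like from Python. The host, credentials, job type name, and field names below are illustrative assumptions, not a verbatim API reference.

```python
# Minimal sketch: submitting a jobs-as-code definition through the
# Control-M Automation API. Host, credentials, job type, and field
# names are illustrative assumptions, not a verbatim API reference.
import json
import requests

API = "https://controlm.example.com:8443/automation-api"  # hypothetical host

# Authenticate and obtain a session token.
token = requests.post(
    f"{API}/session/login",
    json={"username": "demo", "password": "demo"},
).json()["token"]

# A folder containing one job that invokes a cloud function; the
# "Job:AWS Lambda" type and its fields are assumptions for illustration.
definition = {
    "DemoFolder": {
        "Type": "Folder",
        "InvokeLambda": {
            "Type": "Job:AWS Lambda",
            "ConnectionProfile": "AWS_DEMO",
            "Function Name": "enrich-orders",
        },
    }
}

# Submit the definition for an ad hoc run; Control-M tracks the job.
run = requests.post(
    f"{API}/run",
    headers={"Authorization": f"Bearer {token}"},
    files={"definitionsFile": ("jobs.json", json.dumps(definition))},
)
print(run.json())
```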

Emerging Trends in DataOps

The data landscape continues to see substantial investment in analytics software; analysts estimate that spending in this sector surpassed $100 billion last year. According to Matt Turck’s annual overview of the Machine Learning, Artificial Intelligence, and Data (MAD) Landscape, the field has become more crowded, reaching 2,011 entries, more than 500 of them added since 2023. As companies recognize the need to operationalize data initiatives effectively, DataOps is becoming a foundational element; simply adding more engineers is no longer sufficient. The rise of generative AI will further solidify the importance of this operational model.

Key Considerations for Developing a Data Strategy

Investment in data initiatives from executives, including CEOs, CMOs, and CFOs, remains robust, aiming for transformational business outcomes rather than just incremental efficiencies. Three critical aspects must be prioritized:

  1. Alignment with Business Goals: Ensure that the data strategy is directly connected to overarching business objectives, so technology teams focus on what truly matters.
  2. Data Quality and Accessibility: High-quality data is essential; poor quality leads to inaccurate insights. Equally important is ensuring that the right data is accessible to the right people at the right time. Democratizing data access, while implementing the necessary controls, empowers teams to make informed, data-driven decisions.
  3. Production Scalability: The strategy should incorporate operational readiness into data engineering practices from the outset, not just during pilot phases.

The Role of Data Orchestration in Business Strategy

Data orchestration is arguably the cornerstone of DataOps. Many organizations have data scattered across various systems, including cloud, on-premises, legacy databases, and third-party applications. Integrating and orchestrating these disparate data sources into a cohesive system is crucial for effective operations. Proper orchestration facilitates seamless data flow between systems, minimizing issues like duplication and latency while supporting timely decision-making.
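
At its core, an orchestrator guarantees that each step runs only after everything it depends on has succeeded. The tool-agnostic sketch below illustrates that idea with a hypothetical four-step pipeline; real orchestrators add scheduling, retries, and monitoring on top.

```python
# Minimal, tool-agnostic sketch of dependency-aware orchestration:
# each step runs only after all of its upstream dependencies succeed.
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract from two sources, then join, then load.
dependencies = {
    "extract_crm": set(),
    "extract_erp": set(),
    "join_customers": {"extract_crm", "extract_erp"},
    "load_warehouse": {"join_customers"},
}

def run(step: str) -> None:
    print(f"running {step}")  # a real orchestrator dispatches a job here

# static_order() yields steps so that dependencies always come first.
for step in TopologicalSorter(dependencies).static_order():
    run(step)
```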

Customer Challenges in Data Orchestration

Organizations often struggle to deliver data products rapidly while scaling effectively. Generative AI exemplifies this challenge: executives are demanding swift results because of its potential to disrupt businesses that fail to harness it. As generative AI introduces new practices such as prompt engineering and chaining, organizations need ways to integrate large language models, vector databases, and bots into existing data pipelines. A strategic approach to orchestration can accommodate these new technologies while keeping data pipeline automation scalable. One customer referred to Control-M as a “power strip of orchestration,” enabling them to plug in new technologies without a complete system overhaul.
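
One way to picture the “power strip” idea is a registry of step types: a new kind of step, such as an embedding load into a vector database, registers itself and slots into existing pipelines without rewiring them. The sketch below uses invented step names and stand-in logic.

```python
# Sketch of the "power strip" pattern: a registry of step types that lets
# new technologies (LLM calls, vector-store loads) plug into an existing
# pipeline without an overhaul. All step names and logic are invented.
from typing import Callable

STEP_TYPES: dict[str, Callable[[dict], dict]] = {}

def step_type(name: str):
    """Register a new kind of pipeline step under a name."""
    def register(fn: Callable[[dict], dict]):
        STEP_TYPES[name] = fn
        return fn
    return register

@step_type("sql_extract")
def sql_extract(ctx: dict) -> dict:
    ctx["rows"] = ["row1", "row2"]  # stand-in for a real query
    return ctx

# A newer, generative-AI step slots in beside the classic ones.
@step_type("embed_to_vector_store")
def embed_to_vector_store(ctx: dict) -> dict:
    ctx["vectors"] = [hash(r) for r in ctx["rows"]]  # stand-in for embeddings
    return ctx

def run_pipeline(steps: list[str]) -> dict:
    ctx: dict = {}
    for name in steps:
        ctx = STEP_TYPES[name](ctx)  # unknown step names fail fast
    return ctx

print(run_pipeline(["sql_extract", "embed_to_vector_store"]))
```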

Top Tips for Optimal Data Orchestration

While many tips could be offered, one stands out: ensuring interoperability between application and data workflows. This interoperability is essential for achieving both speed and scalability in production. For example, consider a machine learning pipeline designed to predict customers likely to switch to competitors. The data flowing into this pipeline originates from workflows running in ERP/CRM systems, alongside other applications. The success of these application workflows is often a prerequisite for triggering the corresponding data workflows. Once the model identifies at-risk customers, the next step may involve sending promotional offers, requiring interaction with the application layer again. Control-M excels in this area by helping customers manage complex dependencies between application and data layers.
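
A common way to express this interoperability is through events (Control-M calls them conditions): the application workflow signals success, the data workflow waits on that signal, and the scoring result signals the application layer in turn. Here is a deliberately simplified sketch with hypothetical workflow names:

```python
# Sketch of application <-> data workflow interoperability via events
# ("conditions" in Control-M terms). Names and logic are hypothetical.
conditions: set[str] = set()

def app_workflow_close_billing() -> None:
    # ... ERP/CRM batch work happens here ...
    conditions.add("billing-closed")        # signal downstream data work

def data_workflow_score_churn() -> None:
    assert "billing-closed" in conditions   # wait on the application layer
    at_risk = ["cust-42"]                   # stand-in for model scoring
    if at_risk:
        conditions.add("offers-requested")  # hand back to the app layer

def app_workflow_send_offers() -> None:
    assert "offers-requested" in conditions
    print("sending promotional offers")

app_workflow_close_billing()
data_workflow_score_churn()
app_workflow_send_offers()
```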

Opportunities and Challenges in AI Deployment

The rapid advancement of AI, particularly generative AI, is reshaping the technologies within the data ecosystem. New models, vector databases, and automation patterns are emerging, and while these changes are not new, the pace is accelerating. From an orchestration standpoint, we see substantial opportunities with our customers, thanks to our adaptable platform that allows them to integrate new tools and practices into their existing workflows instead of starting from scratch.

Case Study: Successful AI Utilization

Domino’s Pizza serves as a prime example of effectively utilizing Control-M to manage its extensive and complex data pipelines. With over 20,000 locations globally, Domino’s oversees more than 3,000 data pipelines that source data from various origins, including internal supply chain systems, sales data, and third-party integrations. This data undergoes complex transformations before it can inform decisions regarding food quality, customer satisfaction, and operational efficiency across its franchise network.

Control-M is essential in orchestrating these data workflows, ensuring seamless integration across diverse technologies such as MicroStrategy, AMQ, Apache Kafka, Confluent, Greenplum, Couchbase, Talend, SQL Server, and Power BI. Beyond connecting these technologies, Control-M provides end-to-end visibility into the pipelines, ensuring compliance with service-level agreements (SLAs) while managing growing data volumes. This capability enables Domino’s to generate critical reports more swiftly, deliver insights to franchisees, and scale new business services effectively.
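
Conceptually, SLA management here comes down to projecting a pipeline’s finish time and raising an alert before the business deadline is missed. A toy illustration, with invented times and names:

```python
# Toy illustration of SLA monitoring on a pipeline: flag the run as soon
# as the projected finish would miss the business deadline. All numbers
# and names are invented for illustration.
from datetime import datetime, timedelta

sla_deadline = datetime(2024, 1, 15, 6, 0)   # reports due at 06:00
started = datetime(2024, 1, 15, 2, 30)
avg_remaining = timedelta(hours=4)           # estimate from historical runs

projected_finish = started + avg_remaining
if projected_finish > sla_deadline:
    print(f"SLA at risk: projected {projected_finish:%H:%M}, "
          f"due {sla_deadline:%H:%M}")
```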

Looking Ahead at BMC

In the coming year, our strategy for Control-M at BMC will remain focused on a few core principles:

  • We aim to position Control-M as a centralized control point for orchestration as customers adopt modern technologies, especially in the public cloud. This involves continuing to roll out new integrations with the major cloud providers to facilitate orchestration across infrastructure models, from IaaS and containers to PaaS and serverless cloud services.
  • Recognizing that enterprise orchestration requires collaboration across engineering, operations, and business users, we plan to enhance user experience and interface design to promote seamless teamwork.
  • Specifically within DataOps, we are exploring the intersection of orchestration and data quality, prioritizing the integration of data quality practices into application and data workflows; a simple sketch of this pattern follows below. Stay tuned for updates in this area!
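
As a preview of that orchestration-plus-data-quality pattern, the sketch below shows a validation step acting as a gate: downstream jobs run only if the checks pass. The checks and data are invented for illustration.

```python
# Minimal sketch of a data quality gate inside a workflow: downstream
# steps run only if validation passes. Checks and data are invented.
rows = [
    {"customer_id": 1, "spend": 120.0},
    {"customer_id": 2, "spend": 80.5},
]

def quality_gate(records: list[dict]) -> None:
    assert records, "quality gate: dataset is empty"
    for r in records:
        assert r.get("customer_id") is not None, "quality gate: missing key"
        assert r.get("spend", -1) >= 0, "quality gate: negative spend"

quality_gate(rows)  # raises and halts the workflow on a quality breach
print("quality gate passed; downstream jobs may run")
```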