
November 2023

ML and APM: The Role of Machine Learning in Full Lifecycle Application Performance Monitoring

The advent of Machine Learning (ML) has unlocked new possibilities in various domains, including full lifecycle Application Performance Monitoring (APM). With the diversity of modern applications, maintaining peak performance and seamless user experiences poses significant challenges. So where and how do ML and APM fit together? Traditional monitoring methods are often reactive, addressing issues only after they have already affected the application’s performance.

Paving the way for modern search workflows and generative AI apps

Elastic’s innovative investments to support an open ecosystem and a simpler developer experience

In this blog, we want to share the investments that Elastic® is making to simplify your experience as you build AI applications. We know that developers have to stay nimble in today’s fast-evolving AI environment. Yet, common challenges make building generative AI applications needlessly rigid and complicated. To name just a few.

Generative AI explained

When OpenAI released ChatGPT on November 30, 2022, no one could have anticipated that the following six months would bring a dizzying transformation of human society, driven by the arrival of a new generation of artificial intelligence. Since the emergence of deep learning in the early 2010s, artificial intelligence has been in its third wave of development, and the introduction of the Transformer algorithm in 2017 propelled deep learning into the era of large models.

How Generative AI Makes Observability Accessible for Everyone

We are pleased to share a sneak peek of Query Assistant, our latest innovation, which bridges the world of declarative querying with generative AI. Leveraging large language models (LLMs), Coralogix’s Query Assistant translates natural language requests for insights into data queries, delivering deep visibility into all of your data for everyone in your organization.

Build Operational Resilience with Generative AI and Automation

For modern enterprises aiming to innovate faster, gain efficiency, and mitigate the risk of failure, operational resilience has become a key competitive differentiator. But growing complexity, noisy systems, and siloed infrastructure have created fragility in today’s IT operations, making the task of building resilient operations increasingly challenging.

Automate insights-rich incident summaries with generative AI

Does this sound familiar? The incident has just been resolved and management is putting on a lot of pressure. They want to understand what happened and why. Now. They want to make sure customers and internal stakeholders get updated about what happened and how it was resolved. ASAP. But putting together all the needed information about the why, how, when, and who, can take weeks. Still, people are calling and writing. Nonstop.

Using Honeycomb for LLM Application Development

Ever since we launched Query Assistant last June, we’ve learned a lot about working with—and improving—Large Language Models (LLMs) in production with Honeycomb. Today, we’re sharing those techniques so that you can use them to achieve better outputs from your own LLM applications. The techniques in this blog are a new Honeycomb use case. You can use them today. For free. With Honeycomb.

Sponsored Post

Building Exceptional Products: Almaden's Approach

In today's dynamic world of technology and innovation, building products that resonate with customers and stand the test of time is no easy feat. At Almaden, we've cultivated a unique Customer-Centric Product Design approach to product development that prioritizes the customer's perspective over mere technological prowess. In this blog post, we'll delve into the core principles that drive our product development process, emphasizing the importance of understanding objectives, agile methodologies, and the modern tools we use to bring our ideas to life.

AI Explainer: The Dirty Little Secret About ChatGPT

ChatGPT, developed by OpenAI and launched in November 2022, isn’t the only large language model that has received lots of attention lately, but it’s by far the most widely known. A previous blog post that listed a glossary of AI terms included a brief definition of it. You may have read over the past year that GPT-4 (the model behind the paid version of ChatGPT) has been able to pass many difficult exams. Here are just a few.

PagerDuty Copilot | Generative AI for PagerDuty Operations Cloud

Introducing PagerDuty Copilot: Your GenAI assistant for critical operations work. For scaling your teams. For sustaining customer experiences. For moving business forward – faster. Work more efficiently. Protect more revenue. Build greater operational resilience. PagerDuty Copilot is the AI assistant operations teams trust to help them manage business-impacting issues in seconds, not hours. From event to resolution, PagerDuty Copilot’s automations help you resolve issues faster, reduce risk, and control costs.

Using Amazon SageMaker orb to orchestrate model deployment across environments

This tutorial shows you how to use the Amazon SageMaker orb to orchestrate model deployment to endpoints across different environments, and how to use the CircleCI platform to monitor and manage promotions and rollbacks. It uses an example project repository to walk you through every step, from training a new model package version to deploying your model across multiple environments.

Top tips: Four compelling use cases for AI in FinTech

Top tips is a weekly column where we highlight what’s trending in the tech world today and list out ways to explore these trends. This week, we’re examining four use cases for AI in the ever-growing FinTech sector. The FinTech sector has transformed the discussion around the financial services industry from top to bottom.

Monitor Amazon Bedrock with Datadog

Amazon Bedrock is a fully managed service that offers foundation models (FMs) built by leading AI companies such as AI21 Labs, Meta, and Amazon, along with other tools for building generative AI applications. After enabling access to validation and training data stored in Amazon S3, customers can fine-tune their FMs for tasks such as text generation, content creation, and chatbot Q&A, without provisioning or managing any infrastructure.
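
For a concrete sense of what that looks like, here is a minimal sketch using boto3; it is not taken from the Datadog post. The model ID, S3 URIs, role ARN, and hyperparameter values are placeholders, and request/response body shapes vary by model family.

```python
# Sketch only: start a fine-tuning job from S3 data, then invoke a model.
# Model ID, S3 URIs, role ARN, and hyperparameters below are placeholders.
import json
import boto3

bedrock = boto3.client("bedrock")          # control plane (customization jobs)
runtime = boto3.client("bedrock-runtime")  # data plane (inference)

# Kick off a model customization (fine-tuning) job from training data in S3.
bedrock.create_model_customization_job(
    jobName="support-bot-ft-001",
    customModelName="support-bot-v1",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",
    trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
    hyperParameters={"epochCount": "1", "batchSize": "1", "learningRate": "0.00001"},
)

# Invoke a text model; the request body format is model-specific (Titan shown).
response = runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Summarize our Q3 support themes."}),
)
print(json.loads(response["body"].read()))
```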

AI and the evolution of learning: Insights from Coding with Lewis

In this episode, Rob sits down with Lewis Menelaws from Coding with Lewis, a prominent social media influencer and content creator specializing in entertaining and empowering software developers. Together, they explore the evolving landscape of learning the craft, drawing comparisons between the present day and the learning experiences of 25 years ago.

Modernizing ITSM with ITIL 4: Intelligent automation

Generative AI has the world thinking about automation now more than ever before. The Information Technology Infrastructure Library (ITIL) has prioritized it from the start. ITIL has advocated for automation as a transformative tool for organizations to deliver business value, accelerate change, and reinvent service configuration management. By handling mundane tasks, automation can empower people to do more innovative and effective work.

C-suite insights on the transformative power of generative AI

Generative AI is revolutionizing the way businesses operate, from improving operational resilience to mitigating security risks and enhancing customer experiences. In a recent roundup of c-suite insights from three IT leaders — Matt Minetola, CIO, Mandy Andress, CISO, and Rick Laner, chief customer officer — we gain a comprehensive understanding of how generative AI is being used to improve business outcomes across organizations.

Sponsored Post

Workflows in Education: How To Leverage Workflow Automation

Every school encounters a wide range of administrative tasks. From managing admissions and scheduling classes to tracking student performance and coordinating events, the list goes on and on, and these crucial responsibilities can distract educators from their primary role: teaching. Workflow automation software is revolutionizing education by easing that burden. Imagine an environment free from piled-up paperwork, where schedules update themselves and notifications arrive automatically. This isn't a far-fetched dream.

From Cloud to AI: The Evolution of IT Infrastructure

Zenoss Chief Product Officer Trent Fitz was recently featured on the "Tech Talks Daily" podcast, hosted by Neil Hughes. Trent is a pioneer in the realm of AIOps with over two decades of experience in artificial intelligence, cloud computing and cybersecurity. His expertise in cloud computing has significantly influenced his work with AIOps, especially in managing the increasing complexity of cloud-based infrastructure.

Enhance Your Business With These Easy, Yet Effective Methods

In the dynamic world of business, staying ahead often requires continuous improvement and innovation. While some might assume that achieving significant enhancements necessitates complex strategies, many businesses have found that easy yet effective methods can lead to remarkable results. In this blog, we'll explore a range of practical approaches that can help you boost your business without breaking the bank or overextending your resources. Whether you're a small startup or a well-established enterprise, these methods are designed to empower you to enhance your business.

Generative AI & Enterprise IT: Overhyped or Radically Underestimated?

Join Cribl’s Jackie McGuire and Ed Bailey as they discuss AI's current and future state, covering the many challenges and the vast promise of this emerging way to increase productivity and solve problems. Jackie and Ed will also comment on SolarWinds’ response to the SEC charges alleging that SolarWinds and its CISO defrauded investors by repeatedly misleading them about the company's cybersecurity posture. Please join us for a great conversation.

Five worthy reads: The future of tech is clean

Five worthy reads is a regular column on five noteworthy items we’ve discovered while researching trending and timeless topics. In this edition, we explore the emerging market for climate technology, its significance, and why a successful path forward for both startups and established companies lies in embracing clean, green, and planet-friendly solutions. Let’s dive right in.

New Features: AI-assisted postmortems, ilert Terraform updates, and expanded ChatOps capabilities

In incident management, staying ahead of the curve is crucial, and that's what we're doing with our latest suite of features, designed to streamline your workflow and enhance your response capabilities. You have also sent us numerous excellent suggestions during this period. We value your feedback and invite you to reach out to us at support@ilert.com to share your experiences with ilert.

GenAI in customer experience: Powering the future today

ChatGPT captured our collective imagination when it burst into the mainstream last year, setting off a hype cycle that hasn’t abated. The enterprise is where generative AI (GenAI) will become more than tech’s newest shiny object. GenAI is transforming the way we work, unlocking new efficiencies, driving productivity, and creating employee and customer experiences we never could have imagined. ServiceNow is at the forefront of this transformation.

AI Explainer: What Are Neural Networks?

In a previous blog post, which was a glossary of terms related to artificial intelligence, I included a brief definition of "neural networks." Let’s go a bit deeper on that. Neural networks are a class of artificial intelligence (AI) and machine learning models inspired by the structure and functioning of the human brain. They are a subset of AI techniques that have gained significant popularity due to their ability to learn and make decisions from data.
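
To make that concrete, here is a tiny sketch of what "inspired by the brain" means in code: layers of weighted connections followed by non-linear activations. The layer sizes and random weights are illustrative only, not taken from the post.

```python
# A minimal two-layer neural network forward pass (untrained, for illustration).
import numpy as np

rng = np.random.default_rng(0)

X = rng.normal(size=(4, 3))                    # 4 samples, 3 input features

W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)  # input -> hidden (5 "neurons")
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)  # hidden -> single output

def forward(x):
    hidden = np.tanh(x @ W1 + b1)                 # non-linear activation
    return 1 / (1 + np.exp(-(hidden @ W2 + b2)))  # sigmoid output in (0, 1)

print(forward(X))  # one prediction per sample; training would adjust the weights
```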

NVIDIA H200 Power Requirements: Can Your Racks Support Them?

This week, NVIDIA unveiled what they are calling “the world’s most powerful GPU for supercharging AI and HPC workloads,” the H200 Tensor Core GPU. There is much hype around the H200 because it is the first GPU with HBM3e. The larger, faster memory will further enable generative AI and large language models, and advance scientific computing for HPC workloads. Read the NVIDIA press release.

6 Ways Technology Altered the Course of Logistics Industry Evolution

Over the years, technology has had a profound impact on almost every industry. But perhaps one of the most significant transformations has been seen in the logistics industry. The rise of new technologies like artificial intelligence, big data analytics, and automation has completely altered the way companies operate and manage their supply chains. From managing inventory to tracking shipments, technology has revolutionized every aspect of logistics.

Balancing The Benefits Of AI With Individual Privacy Rights

The rise of artificial intelligence (AI) technologies has ushered in a wave of powerful capabilities. From self-driving cars to facial recognition algorithms, AI is everywhere, providing companies and individuals with practical tools for streamlining workflows and improving lives. What's also become increasingly clear is that the expansion of AI can potentially violate individual privacy rights, a cause for concern among consumers and governments alike.

Innovate at speed with open source AI

AI is swiftly changing our lives in many invisible ways, from the movies we watch and the video games we play all the way to real-time space tracking. New tools and applications pop up all the time, but keeping up with all this innovation is difficult. Which toolchains should you use, and which platforms should you build on? Canonical helps you innovate at speed with easy-to-use solutions that benefit from long-term support.

The Future of Operations: AI-powered Internet Performance Monitoring

At Catchpoint, our philosophy is that AI should not be adopted simply for the sake of AI itself. Instead, it should be embraced when it proves to be the most effective solution for addressing a particular business challenge. While the world is currently in the fervor of the oncoming AI revolution, our industry-leading IPM platform has quietly harnessed the potential of Artificial Intelligence for years.

InoNet and Canonical partner for seamless Edge AI deployment

InoNet Computer GmbH, a Eurotech Company, known for engineering and manufacturing embedded systems and Edge AI computers, has entered into a strategic partnership with Canonical, the publisher of Ubuntu. Together they are set to deliver a robust platform for deploying IoT solutions, introducing cutting-edge Ubuntu-certified computers.

AI for Software Development | Dharmesh Shah: The Future of Software, GitKon 2023

"The future opportunity is not having generative AI generate something for human consumption, it's having it generate things for computer consumption." Dharmesh breaks down what he believes is the next biggest advancement in software development (and beyond) – using AI to transform ideas from our brains.

7 Habits of Successful Generative AI Adopters

Generative AI is forecast to have a massive impact on the economy. Such headlines are driving software teams to rapidly consider how they can incorporate generative AI into their software, or risk falling behind in a sea change of disruption. But in the froth of a disruptive technology, there is also a high risk of wasted investment and lost customer trust.

The Future of Serverless is AI WebAssembly by Matt Butcher - Navigate Europe 23

Join Matt Butcher as he explores the future of serverless computing, unveiling the power of WebAssembly and AI inferencing on Fermyon's innovative platform. Learn about the evolution from virtual machines and containers to serverless functions, understand serverless computing from a developer's perspective, and discover how Fermyon is making AI inferencing more accessible and efficient.

The Unplanned Show E20: LLM Observability w/Charity Majors & James Governor

Large language models (LLMs) are foundational to generative AI capabilities, but they present new challenges from an observability perspective. Hear from observability thought leader and Honeycomb CTO/co-founder Charity Majors and developer-focused analyst and RedMonk co-founder James Governor in this discussion of LLM observability, as more organizations build business-critical features on LLMs.

AI Explainer: What Are Reinforcement Learning 'Rewards'?

In a previous blog post, which was a glossary of terms related to artificial intelligence, I included a brief definition of "reinforcement learning." I expect that definition would prompt many to ask, "What rewards can you give a machine learning agent?" A gold star? Praise? No, the short answer is: numerical values. In reinforcement learning, rewards are crucial for training agents to make decisions that maximize their performance in a given environment.
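
As a quick illustration of "rewards are numerical values," here is a toy rollout in plain Python. The environment, reward values, and discount factor are invented for this sketch, not taken from the post.

```python
# Toy example: an agent collects numeric rewards and a discounted return.
GAMMA = 0.9  # discount factor: how much future rewards count today

def step(state: int, action: int) -> tuple[int, float]:
    """Move left (-1) or right (+1) on a number line; reaching state 3 pays +1."""
    next_state = state + (1 if action == 1 else -1)
    reward = 1.0 if next_state == 3 else -0.01  # small penalty per step
    return next_state, reward

# Roll out a fixed "always move right" policy and accumulate the
# discounted return G = r_0 + GAMMA*r_1 + GAMMA^2*r_2 + ...
state, g, discount = 0, 0.0, 1.0
for _ in range(5):
    state, reward = step(state, action=1)
    g += discount * reward
    discount *= GAMMA

print(f"discounted return: {g:.3f}")
```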

LM Co-Pilot: Your AI Co-Pilot for the Magical Streamlining of IT and Cloud Operations

LogicMonitor’s Generative Intelligence Solution for IT Teams

Cutting-edge generative technologies have revolutionized our industry, paving the way for fresh and innovative approaches to deliver interactive and actionable experiences. At LogicMonitor, we firmly believe in leveraging these generative techniques across our platform, offering a uniquely dynamic support system for various aspects of our end-user experience.

Elasticsearch and LangChain collaborate on production-ready RAG templates

For the past few months, we’ve been working closely with the LangChain team as they made progress on launching LangServe and LangChain Templates! LangChain Templates is a set of reference architectures to build production-ready generative AI applications. You can read more about the launch here.

Build and evaluate LLM-powered apps with LangChain and CircleCI

Generative AI has already shown its huge potential, but there are many applications that out-of-the-box large language model (LLM) solutions aren’t suitable for. These include enterprise-level applications like summarizing your own internal notes and answering questions about internal data and documents, as well as applications like running queries on your own data to equip the AI with known facts (reducing “hallucinations” and improving outcomes).
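
For readers who want to see the "equip the AI with known facts" idea in code, here is a minimal, generic sketch using the OpenAI Python client. The documents, the naive retrieve() helper, and the model name are hypothetical stand-ins; a real build would use LangChain or a vector store for retrieval, as the post describes.

```python
# Sketch: answer a question from your own documents so the model works
# from known facts rather than guessing. DOCS and retrieve() are
# hypothetical stand-ins for a real document store and retrieval layer.
from openai import OpenAI

DOCS = [
    "Invoices are processed on the 1st and 15th of each month.",
    "The on-call rotation changes every Monday at 09:00 UTC.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    # Naive keyword overlap; a production system would use embeddings.
    words = set(question.lower().split())
    return sorted(DOCS, key=lambda d: -len(words & set(d.lower().split())))[:k]

def answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": f"Answer only from these notes:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

print(answer("When does the on-call rotation change?"))
```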

Quantifying the value of AI-powered observability

Organizations saw a 243% ROI and $1.2 million in savings over three years.

In today’s complex and distributed IT environments, traditional monitoring falls short. Legacy tools often provide limited visibility across an organization’s tech stack, and at a high cost, resulting in selective monitoring. Many companies are therefore realizing the need for true, affordable end-to-end observability, which eliminates blind spots and improves visibility across their ecosystem.

AI Explainer: What Are Generative Adversarial Networks?

I previously posted a blog that was a glossary of terms related to artificial intelligence. It included a brief definition of "generative AI." I expect that for someone learning about AI, it's frustrating to read definitions of terms that include other terms you may not understand. In this case, generative adversarial networks — GANs — is probably a new term for many. This post will explain what GANs are for that reason — and also because they’re super cool.

The future with large language models (LLMs) feat. Ramprakash Ramamoorthy

Expanding on our previous topic of large language models in enterprise IT, Ramprakash Ramamoorthy, Director of AI research at ManageEngine and Zoho Corporation, takes it one step further as we dive deeper into the various functions of a business, and the normalization of LLM integration in those operations.

Top tips: 3 surprising ways generative AI can boost your data analysis

Top tips is a weekly column where we highlight what’s trending in the tech world and list ways to explore these trends. When you think about generative AI, what instinctively comes to your mind is content and image generation. But, in this week’s Top tips column, let’s look at a less-explored facet of generative AI: data analytics. There are a lot of conversations about data and its benefits.

AI Explainer: What Is Data Cleaning?

In a previous blog post, which was a glossary of terms related to artificial intelligence, I included a brief definition of "data preprocessing." It is common for people familiar with these matters to talk about not having clean data. Whatever your AI needs are, clean data is crucial to the quality of your results. Garbage in, garbage out, as they say. So, let’s dive into what it means to have clean data.
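
To ground the idea, here is a small, generic cleaning pass in pandas: de-duplicate, enforce required fields, and coerce types. The column names and rules are illustrative only, not from the post.

```python
# Illustrative data cleaning: duplicates, missing keys, and bad types.
import pandas as pd

raw = pd.DataFrame({
    "user_id": [1, 1, 2, 3, None],
    "signup_date": ["2023-01-05", "2023-01-05", "not a date", "2023-02-11", "2023-03-02"],
    "monthly_spend": ["19.99", "19.99", "24.50", "", "31.00"],
})

clean = (
    raw.drop_duplicates()              # remove exact duplicate rows
       .dropna(subset=["user_id"])     # require the key field
       .assign(
           signup_date=lambda df: pd.to_datetime(df["signup_date"], errors="coerce"),
           monthly_spend=lambda df: pd.to_numeric(df["monthly_spend"], errors="coerce"),
       )
       .dropna()                       # drop rows whose values failed conversion
)

print(clean)
```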

Monitor your OpenAI usage with Grafana Cloud

In the ever-changing field of artificial intelligence, OpenAI is consistently seen as a leader in innovation. Its AI models, starting with GPT-3 and now with GPT-4, are already used extensively in software development and content creation, and they’re expected to usher in entire sets of new systems in the future.
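
As a hedged illustration of what "usage" you would chart, the OpenAI API returns token counts with every response; those numbers are what a Grafana Cloud dashboard would typically be built on. The model name is a placeholder, and how you export the metrics (Prometheus, OpenTelemetry, and so on) depends on your setup.

```python
# Sketch: pull the token counts that usage dashboards are usually built on.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "One sentence on observability."}],
)

usage = resp.usage
print(
    f"prompt_tokens={usage.prompt_tokens} "
    f"completion_tokens={usage.completion_tokens} "
    f"total_tokens={usage.total_tokens}"
)
```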

Building AI bots on your private data with an open source stack

Join Andrew Zigler, Developer Advocate at Mattermost, in this virtual presentation from AI DevWorld 2023 and learn how you can leverage open source tools to create intelligent bots that help you and your team get ahead in collaboration tasks, without compromising on the security and privacy of your data. We'll cover the ins and outs of working with AI, the importance of data privacy, and explore pathways to safely experiment with AI bots and revolutionize collaborative efforts at your organization.

The Role of Generative AI and Large Language Models in IT Operations

Artificial intelligence, particularly generative AI and large language models, has changed how we approach IT operations, cybersecurity, and observability. And though we can point to measurable benefits and outcomes from applying LLMs to ITOps, there is also a lot of speculation to deal with. Phillip Gervasi, Director of Technical Evangelism at Kentik, and Christoph Pfister, Chief Product Officer at Kentik, discuss what generative AI and LLMs are, how they can be used to improve IT operations, and what the future might hold.

The Future of Machine Learning and AI Panel Discussion - Navigate Europe 23

Dive deep into the future of AI and ML with Josh Mesout, Andrew Maguire, Jason Grant, Tatiana Botskina, and Dan Lewis. In this engaging discussion, they share insights from their varied backgrounds, tackling critical topics such as generative AI, ethical AI practices, and the evolving landscape of AI regulation. Whether you're a seasoned professional or new to the field, this video provides a unique opportunity to learn from industry leaders and gain a comprehensive understanding of the challenges and opportunities in AI and ML.

Risks and rewards of generative AI for software development

Generative artificial intelligence (AI) is a form of AI that can create new, original content such as text, code, images, video, and even music. Generative AI-powered tools like GitHub’s Copilot and OpenAI’s ChatGPT have the potential to revolutionize the way you develop software, enabling you to be more efficient and creative. Used in the right way, generative AI can streamline workflows, accelerate development cycles, and unlock the potential for innovation.