February 2024

Step-by-step Guide to Monitor Riak Using Telegraf and MetricFire

Monitoring your databases is essential for maintaining the performance, reliability, security, and compliance of your infrastructure. It allows you to stay ahead of potential issues, optimize resource utilization, and ensure smooth, efficient operation of your database system. Effective monitoring of Riak involves collecting, analyzing, and acting on a variety of metrics and logs.
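
As a minimal sketch of what that collection involves, assuming a default single-node Riak install on localhost, the snippet below polls the node's /stats endpoint, the same JSON feed Telegraf's riak input plugin scrapes:

```python
import requests

# A Riak node exposes its statistics as JSON at /stats; Telegraf's
# riak input plugin collects from this same endpoint.
STATS_URL = "http://localhost:8098/stats"  # assumed default local node

stats = requests.get(STATS_URL, timeout=5).json()

# A few core throughput counters worth watching
for key in ("node_gets", "node_puts", "vnode_gets", "vnode_puts"):
    print(f"{key}: {stats.get(key)}")
```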

5 things to look for in a database monitoring tool

Databases are the backbone of enterprise-level organizations, facilitating efficient data management, supporting critical business processes, and providing a foundation for secure innovation and growth. Effective database monitoring is essential for maintaining performance, security, and reliability, particularly in enterprise environments where large volumes of data and critical business operations are involved.

The real origins of the Agile Manifesto

In February 2001, 17 people met at the Snowbird ski resort in Utah. They were the leading exponents of Extreme Programming, Scrum, and Adaptive Software Development, and they were seeking a set of compatible values based on trust, respect, and collaboration. They wanted to make software development easier, and they found what they were looking for in the form of a manifesto. Their only concern was that the term describing the manifesto came from a ‘Brit’ and they weren’t sure how to pronounce it.

Why organisations across Australia should embrace Test Data Management

In an era marked by the convergence of Big Data, hybrid cloud use, and the rise of machine learning and artificial intelligence, organisations across the Asia-Pacific (APAC) region find themselves at a critical juncture. The ability to collect, manage, and leverage data effectively is now a determining factor for competitive advantage, and its importance will only grow.

From Chaos to Clarity: Troubleshooting Postgres

I have always had a special fondness for Postgres, and I am particularly captivated by its JSONB capabilities, which are at the core of what I love about it and stand as a testament to its adaptability. This functionality exemplifies Postgres’s ability to adapt to the rapidly changing needs of modern applications, allowing structured and unstructured data to coexist seamlessly within the same database environment. And PostgreSQL’s capabilities are not limited to managing relational and non-relational data.
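
As a minimal sketch of that coexistence (the events table, its columns, and the connection string are invented for the example; a local PostgreSQL instance and the psycopg2 driver are assumed), a structured column and a JSONB document can live side by side and be queried together:

```python
import json
import psycopg2

# Assumes a local PostgreSQL instance; connection details are placeholders.
conn = psycopg2.connect("dbname=demo user=postgres")
cur = conn.cursor()

# A structured column (kind) next to an unstructured JSONB payload
cur.execute("""
    CREATE TABLE IF NOT EXISTS events (
        id      serial PRIMARY KEY,
        kind    text NOT NULL,
        payload jsonb
    )
""")
cur.execute(
    "INSERT INTO events (kind, payload) VALUES (%s, %s::jsonb)",
    ("signup", json.dumps({"plan": "pro", "referrer": "blog"})),
)

# ->> extracts a JSONB field as text, so documents are queryable
# with ordinary SQL predicates
cur.execute("SELECT id, payload->>'plan' FROM events WHERE kind = %s", ("signup",))
print(cur.fetchall())
conn.commit()
```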

The Data Table Webcast Series with Kevin Kline

The Data Table is a webcast series tailored for data and database professionals like you, hosted by SolarWinds' own tech evangelist, Kevin Kline. Each session gives database professionals a seat at the table to discuss the latest news, best practices, and strategies in the Data and Analytics (D&A) space. It's not just about theory; it's an opportunity to connect with and learn from Kevin and other IT industry experts about the topics IT pros really care about.

How Time Series Databases and Data Lakes Work Together

In the fast-paced world of software engineering, efficient data management is a cornerstone of success. Imagine you’re working with streams of data that not only require rapid analysis but also need to be stored for long-term insights. This is where the powerful duo of time series databases (TSDBs) and data lakes can help.

How to speed up MySQL and PostgreSQL queries for FREE

How can I speed up a SQL query? This video showcases how to optimize a SQL query on MySQL or PostgreSQL for free using AI and a tool called EverSQL by Aiven. In the example shown, query performance on a MySQL database went from 20 seconds to 0.5 seconds simply by pasting the SQL and additional metadata into the EverSQL by Aiven UI and applying the suggested indexes and SQL rewrites.
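
As a hypothetical illustration of the kind of change such a tool suggests (the orders table, its customer_id column, and the connection details below are invented for the example, not taken from the video), the fix is often a supporting index for a filtered column:

```python
import psycopg2

# Hypothetical example against a local PostgreSQL database.
conn = psycopg2.connect("dbname=demo user=postgres")
cur = conn.cursor()

# Before: a filter on an unindexed column forces a sequential scan
cur.execute("EXPLAIN SELECT * FROM orders WHERE customer_id = 42")
print("\n".join(row[0] for row in cur.fetchall()))

# The kind of suggestion an optimizer produces: an index that lets
# the planner switch to an index scan
cur.execute(
    "CREATE INDEX IF NOT EXISTS orders_customer_id_idx ON orders (customer_id)"
)
conn.commit()
```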

NoSQL vs SQL: Key differences and when to choose each

Until recently, the default model for application development was SQL. In recent years, however, NoSQL has become a popular alternative. The wide variety of data stored today and the workloads that servers must support force developers to consider more flexible and scalable options. NoSQL databases enable agile development and adapt easily to change. Even so, they cannot be considered a replacement for SQL, nor are they the best choice for every type of project.
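
A minimal sketch of the modelling difference, using SQLite and a plain dictionary as stand-ins for the two worlds:

```python
import sqlite3

# Relational (SQL): the schema is declared up front; adding a field
# later means an ALTER TABLE migration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO users (name, email) VALUES (?, ?)",
             ("Ada", "ada@example.com"))

# Document (NoSQL): each record is self-describing, so new or nested
# fields can appear per record with no schema change.
user_doc = {
    "name": "Ada",
    "email": "ada@example.com",
    "preferences": {"theme": "dark"},  # added freely, no migration
}

print(conn.execute("SELECT name FROM users").fetchall())
print(user_doc["preferences"])
```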

Navigate memory management challenges in MongoDB with Site24x7

Effective memory management is crucial for optimal MongoDB performance, helping ensure seamless database operations and a smooth user experience. Allocating enough memory lets the database keep frequently used data and indexes in RAM, cutting down on disk I/O operations, which shortens query response times and improves system responsiveness. Poor memory management forces data to be retrieved from disk, leading to performance degradation.
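
As a minimal sketch of how to watch this in practice, assuming a local MongoDB instance and the pymongo driver, serverStatus exposes the WiredTiger cache counters that show how much of the working set actually fits in RAM:

```python
from pymongo import MongoClient

# Assumes a local MongoDB instance; the URI is a placeholder.
client = MongoClient("mongodb://localhost:27017")

status = client.admin.command("serverStatus")
cache = status["wiredTiger"]["cache"]

# If "bytes currently in the cache" keeps pressing against the
# configured maximum, reads start falling through to disk.
print("cache max:      ", cache["maximum bytes configured"])
print("cache used:     ", cache["bytes currently in the cache"])
print("read into cache:", cache["bytes read into cache"])
```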

Debugging and Decoding MongoDB with OpenTelemetry

MongoDB’s flexibility and document-oriented nature have always stood out to me as its most compelling features, setting it apart from the strict schema constraints of traditional relational databases. This adaptability is a boon for application development, allowing for more dynamic data interactions that mirror the complexity of real-world information and freeing developers from the constraints of rigid table schemas.

Cloud Migration Challenges: Overcoming Obstacles - TJay Belt | Redgate

TJay Belt, Director of Data at Nerd United, shares his thoughts on the obstacles you face when migrating to the cloud. Cloud adoption has been steadily on the rise for a number of years, driven by benefits such as scalability, accessibility, flexibility, security, and cost.

How to reduce expenses on monitoring: Swapping in VictoriaMetrics for Prometheus

Monitoring can get expensive due to the huge quantities of data that need to be processed. In this blog post, you’ll learn the best ways to store and process monitoring metrics to reduce your costs, and how VictoriaMetrics can help. This blog post will only cover open-source solutions. VictoriaMetrics is proudly open source. You’ll get the most out of this blog post if you are familiar with Prometheus, Thanos, Mimir or VictoriaMetrics.

Charmed MongoDB: use cases for financial services

Financial institutions handle vast amounts of sensitive and confidential data, including customer information, transaction details, and regulatory compliance records. A trusted database ensures the security and privacy of this sensitive information, protecting it from unauthorised access, breaches, or cyber threats. MongoDB is the ideal fit, and it’s one of the most widely used databases in the financial services industry. It provides a sturdy, adaptable and trustworthy foundation.

Augmenting Your DBA Toolkit: Harnessing the Power of Time Series Databases

Database Administrators (DBAs) rely on time series data every day, even if they don’t think of time series data as a unique data type. They rely on metrics such as CPU usage, memory utilization, and query response times to monitor and optimize databases. These metrics inherently have a time component, making them time series data. However, traditional databases aren’t specifically designed to handle the unique characteristics and workloads associated with time series data.
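
As a small illustration of the point (the metric names and values below are invented), every monitoring sample a DBA already collects is really a timestamped value, which is exactly the shape a TSDB is built to ingest and query:

```python
import time
from dataclasses import dataclass

@dataclass
class Sample:
    metric: str       # e.g. "cpu_usage" or "query_response_ms"
    timestamp: float  # the time component that makes this time series data
    value: float

# Hypothetical samples of the kind a DBA's monitoring already produces
samples = [
    Sample("cpu_usage", time.time(), 42.5),
    Sample("query_response_ms", time.time(), 12.3),
]
for s in samples:
    print(f"{s.metric} @ {s.timestamp:.0f}: {s.value}")
```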

"As DBAs, should we be worried about our jobs because of AI?" and other burning questions

We recently launched the State of the Database Landscape 2024 survey results, with information from almost 4,000 database professionals around the globe. A clear picture emerged from the results, suggesting that 2024 is the year skill diversification among database professionals becomes imperative. There’s the need to manage multiple databases, to migrate to the cloud, to introduce continuous delivery with DevOps, and even to incorporate Generative AI into the mix.

Azure Cosmos DB Pricing (2024)

Azure Cosmos DB, a global, multi-model database service from Microsoft Azure, enables globally responsive and scalable applications with low-latency, high-throughput data access. With support for diverse data models, global distribution, flexible consistency models, automatic scaling, and comprehensive SLAs, it’s well suited to modern applications requiring agility, security, and compliance.