Displaying posts with tag: Real-time
MySQL Cluster and real-time requirements

This blog post gives some background on the decisions made when designing
NDB Cluster, the storage engine used in MySQL Cluster, around how to support
real-time requirements (or, as I sometimes refer to them, predictable
response time requirements).

Requirement analysis

When analysing the requirements for NDB Cluster based on its usage in telecom
databases, two things were important. The first requirement is that we need to
be able to respond to queries within a few milliseconds (today even down to
tens of microseconds). The second requirement is that we need to do this while
at the same time handling a mix of simple traffic queries and more complex
queries running concurrently.

The first requirement was the main one that led to NDB Cluster using a
main-memory storage model, with durability on disk provided by a REDO log and
various …
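
As a rough illustration (not taken from the post itself): at the SQL layer,
the main-memory model is visible simply as the table's storage engine, while
the data nodes handle the REDO logging behind the scenes. A minimal JDBC
sketch, assuming a running MySQL Cluster SQL node at a hypothetical host
sqlnode1 with a test schema and hypothetical credentials:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class NdbExample {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://sqlnode1:3306/test", "user", "password")) {
                // ENGINE=NDBCLUSTER keeps the table in the data nodes' main
                // memory; durability comes from the on-disk REDO log.
                try (PreparedStatement ddl = conn.prepareStatement(
                        "CREATE TABLE IF NOT EXISTS subscriber (" +
                        "  id BIGINT NOT NULL PRIMARY KEY," +
                        "  balance INT NOT NULL" +
                        ") ENGINE=NDBCLUSTER")) {
                    ddl.execute();
                }
                // A primary-key lookup: the kind of simple traffic query the
                // post says must be answered within a few milliseconds.
                try (PreparedStatement lookup = conn.prepareStatement(
                        "SELECT balance FROM subscriber WHERE id = ?")) {
                    lookup.setLong(1, 12345L);
                    try (ResultSet rs = lookup.executeQuery()) {
                        if (rs.next()) {
                            System.out.println("balance = " + rs.getInt(1));
                        }
                    }
                }
            }
        }
    }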

[Read more]
MySQL Cluster in Environments Requiring Real-Time Analytics

MySQL Cluster is used across many different industries. It thrives in the most complex data environments that demand real-time analytics, such as:

  • Financial trading with fraud protection
  • Feed-streaming analysis and recommendations
  • Massively multiplayer online games
  • Communication services

To learn more about MySQL Cluster, take the MySQL Cluster training course. Below is a selection of events already on the schedule for this 3-day instructor-led course:

 Location           Date  Delivery Language
 São Paulo, Brazil
[Read more]
Real-time streaming data aggregation

Dear Kettle users,

Most of you usually use a data integration engine to process data in a batch-oriented way. Pentaho Data Integration (Kettle) is typically deployed to run monthly, nightly, or hourly workloads. Sometimes folks run micro-batches of work every minute or so. However, it's less well known that our beloved transformation engine can also be used to stream data indefinitely (never ending) from a source to a target. This sort of data integration is sometimes referred to as “streaming”, “real-time”, “near real-time”, “continuous” and so on. Typical examples of situations where you have a never-ending supply of data that needs to be processed the instant it becomes available are JMS (Java Message Service), RDBMS log sniffing, on-line fraud analysis, web or application …
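
As a hedged sketch (plain Java rather than Kettle itself): the same
never-ending pattern against a JMS source, assuming ActiveMQ as the provider
and a hypothetical queue named events. The loop blocks on receive() and
handles each message the instant it arrives, instead of running as a batch:

    import javax.jms.Connection;
    import javax.jms.MessageConsumer;
    import javax.jms.Session;
    import javax.jms.TextMessage;

    import org.apache.activemq.ActiveMQConnectionFactory;

    public class StreamingConsumer {
        public static void main(String[] args) throws Exception {
            ActiveMQConnectionFactory factory =
                    new ActiveMQConnectionFactory("tcp://localhost:61616");
            Connection connection = factory.createConnection();
            connection.start();
            Session session =
                    connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer =
                    session.createConsumer(session.createQueue("events"));

            // Runs indefinitely, like a streaming transformation.
            while (true) {
                // Blocks until the next message arrives.
                TextMessage message = (TextMessage) consumer.receive();
                // Hand the payload to downstream processing
                // (aggregation, lookups, output, ...).
                System.out.println("received: " + message.getText());
            }
        }
    }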

[Read more]