In-Memory OLTP for Noobs Part 1

May 11, 2017


In-Memory OLTP is new to me, so I’m going to use this series of articles to build a better understanding of how it works and when and how to implement it.

What is it?

In-Memory OLTP (code name Hekaton) was first introduced in SQL Server 2014. It is touted as delivering significant performance gains for highly transactional workloads, letting organizations get more out of their existing hardware. SQL Server 2016 added a number of enhancements, and the feature is also available in Azure (I will not be going in depth on Azure in this blog series).
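One prerequisite worth knowing up front: before you can create any memory-optimized objects, the database needs a filegroup that contains memory-optimized data. A minimal sketch in T-SQL (the database name, filegroup name, and file path below are placeholders for your own environment):

```sql
-- Hypothetical database name and path; adjust for your environment.
ALTER DATABASE MyDb
    ADD FILEGROUP MyDb_mod CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE MyDb
    ADD FILE (NAME = 'MyDb_mod_file', FILENAME = 'C:\Data\MyDb_mod')
    TO FILEGROUP MyDb_mod;
```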

 

Microsoft claims to have seen transactions perform up to 30x faster, though the actual gain varies with your workload.

How does it work?

Basically, transactions run faster because lock and latch contention is removed, which improves the efficiency of concurrency. So it’s not just faster because the data is sitting in memory; the transactions that pull the data are also optimized from the perspective of concurrency.

Now understand that only certain kinds of objects can use In-Memory OLTP:

Memory-Optimized Tables – store user data entirely in memory.

Natively Compiled T-SQL Modules – stored procedures, triggers, and scalar UDFs can be natively compiled at creation time, so fewer CPU cycles are needed to process them.

Non-Durable Tables – think of these as temp tables; they’re used for transient or intermediate data/result sets and help reduce I/O.

Memory-Optimized Table Types – can be used instead of traditional table types, so table variables live in memory and help reduce I/O.
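Each of the object types above can be sketched in T-SQL. All table, type, and procedure names here are hypothetical, purely to illustrate the syntax:

```sql
-- Durable memory-optimized table: data survives a restart (the default durability).
CREATE TABLE dbo.OrderEvents
(
    OrderId   INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    CreatedAt DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

-- Non-durable table: schema persists, contents are lost on restart, no log I/O.
CREATE TABLE dbo.StagingRows
(
    RowId INT NOT NULL PRIMARY KEY NONCLUSTERED
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);

-- Memory-optimized table type, usable for table variables.
CREATE TYPE dbo.IdList AS TABLE
(
    Id INT NOT NULL PRIMARY KEY NONCLUSTERED
)
WITH (MEMORY_OPTIMIZED = ON);
GO

-- Natively compiled stored procedure against the durable table above.
CREATE PROCEDURE dbo.InsertOrderEvent
    @OrderId INT
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.OrderEvents (OrderId, CreatedAt)
    VALUES (@OrderId, SYSUTCDATETIME());
END;
```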


If the data is sitting in memory, would you lose your transactions if something goes down? Luckily, the feature is fully compatible with HA/DR solutions such as AlwaysOn Availability Groups and backup/restore.

As for the transactions, they are written to the transaction log on disk before being committed to the database. So even in the event of a server failure, In-Memory OLTP (with the default durability setting) is advertised as fully durable.
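A quick way to check which durability option a memory-optimized table was created with is to query the catalog views (the results, of course, depend on your own database):

```sql
-- is_memory_optimized and durability_desc were added to sys.tables in SQL Server 2014.
SELECT name, is_memory_optimized, durability_desc
FROM sys.tables
WHERE is_memory_optimized = 1;
```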


Why is it useful to an organization?

Well, if you’re in an environment with high-throughput, low-latency transactional workloads, In-Memory OLTP might be beneficial to you. Microsoft has stated that workloads from trading/financial applications, mobile gaming, and even ad delivery have all benefited from this feature.


It also sounds like a good way to get more mileage out of your hardware. We have so much memory that goes underutilized at times.


In subsequent blog posts I’m going to get into more technical detail and actual implementation, in an effort to build a deeper understanding of this feature.
