
WHO AM I?

My name is Aamir Syed and I run SQLEvo. I optimize the speed, efficiency, and stability of SQL Server. When not working, I like to lift weights, play music, and travel with my wife.


Reading the Deadlock Graph
It's important to be able to analyze the deadlock graph once it's been detected. I'm going to give a brief rundown of what I look at. The process nodes show information regarding the actual modification. The node that is X'd out is the deadlock victim: the process that was killed as part of the deadlock resolution. Each process node shows the SPID, the deadlock priority (LOW = -5, NORMAL = 0, HIGH = 5), and Log Use (the space the process uses in the transaction log).
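
To get your hands on the graph in the first place, here's a minimal sketch that pulls recent xml_deadlock_report events from the built-in system_health Extended Events session (ring buffer target). The session, event, and field names are standard, but treat the query shape as a starting point rather than a finished script.

-- Pull recent deadlock graphs captured by the system_health session.
WITH ring AS (
    SELECT CAST(st.target_data AS XML) AS target_xml
    FROM sys.dm_xe_session_targets AS st
    JOIN sys.dm_xe_sessions AS s
        ON s.address = st.event_session_address
    WHERE s.name = N'system_health'
      AND st.target_name = N'ring_buffer'
)
SELECT
    x.event_node.value('(@timestamp)[1]', 'datetime2')                 AS event_time,
    x.event_node.query('(data[@name="xml_report"]/value/deadlock)[1]') AS deadlock_graph
FROM ring
CROSS APPLY ring.target_xml.nodes('RingBufferTarget/event[@name="xml_deadlock_report"]') AS x(event_node);

Each deadlock_graph value can be saved as an .xdl file and opened in SSMS to view the process and resource nodes described above.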


Book Review: SQL Server Query Performance Tuning by Grant Fritchey
My war-torn copy. I spent the last few months looking like a total dork reading this book on the train. But I'm a rockstar when I get to my client's office, thanks to much of the information found in this resource. The book doesn't just give you scenarios and how to react to them; rather, it helps you think analytically when facing certain SQL Server symptoms. It gives you places to start, and then your understanding of the process directs the next troubleshooting steps. After r

Detecting Storage I/O Problems in SQL Server
So here's a scenario. You have a client that has implemented SAN storage for their SQL Servers (virtual or physical). Your manager tells you that ever since the SAN team got involved, things have been running slower, and he's sure it's a SAN problem. However, the SAN guys aren't going to do anything unless you provide them with some sort of proof. I'm the type that likes to back up common sense with actual metrics. I'm also the type that likes to work together to resolve th
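
For the metrics side, here's a minimal sketch of the kind of numbers I'd bring to that conversation, using sys.dm_io_virtual_file_stats. Keep in mind the counters are cumulative since the last restart, so take two snapshots a few minutes apart if you want a current rate rather than a lifetime average.

-- Per-file average read/write latency since the last SQL Server restart.
SELECT
    DB_NAME(vfs.database_id) AS database_name,
    mf.physical_name,
    vfs.num_of_reads,
    vfs.num_of_writes,
    CASE WHEN vfs.num_of_reads = 0 THEN 0
         ELSE vfs.io_stall_read_ms / vfs.num_of_reads END  AS avg_read_latency_ms,
    CASE WHEN vfs.num_of_writes = 0 THEN 0
         ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_latency_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
    ON mf.database_id = vfs.database_id
   AND mf.file_id = vfs.file_id
ORDER BY avg_read_latency_ms DESC;

As a rough rule of thumb, sustained latencies well into the double digits of milliseconds on data files are the kind of concrete evidence worth handing to the SAN team.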


Parameter Sniffing
Not always optimal... What is it? Let’s say you have a parameterized proc, and SQL Server creates and caches an execution plan based on the specific values passed in for those parameters. It “sniffs” the parameters and uses them for cardinality estimation. Then you run the stored proc again, but with different values for the parameters. SQL Server is going to reuse the cached execution plan even though a different plan would be better suited to the new values (and uneven
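
A minimal sketch of the behavior, using hypothetical table and procedure names (dbo.Orders and dbo.GetOrdersByCustomer are placeholders, not from the original post):

-- Hypothetical table with skewed data: a few customers own most of the rows.
CREATE TABLE dbo.Orders
(
    OrderId    INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId INT NOT NULL,
    OrderDate  DATETIME2 NOT NULL
);
GO

CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerId INT
AS
BEGIN
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END;
GO

-- First call: the plan is compiled and cached for a customer with few orders.
EXEC dbo.GetOrdersByCustomer @CustomerId = 42;

-- Second call: a customer with millions of orders reuses the same cached plan,
-- which may no longer be the best shape for this value.
EXEC dbo.GetOrdersByCustomer @CustomerId = 7;

Common mitigations, each with trade-offs, include OPTION (RECOMPILE), OPTIMIZE FOR hints, or restructuring the procedure.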


Consulting Series: The Value of Performance Tuning
What does the performance of SQL Server mean to you and your company? What are the benefits? How is the value perceived? Is it in dollars saved? Dollars earned? How about concepts that do not involve money? Peace of mind? The fact that the server will process these important transactions efficiently and accurately? You always know how long it will take to pull said report? It’s so quick that you can pull it often. Case Scenario: I spent some time working for a major financial

In-Memory OLTP for Noobs Pt. 2
So let's jump right in and see if we can't make some of the In-Memory OLTP features work; then I can elaborate on the details. Create Memory Optimized Filegroup: I’m going to go through the process of creating a memory-optimized filegroup, then we'll add a container to said filegroup. Create Tables: Then I will go ahead and create a traditional table and an In-Memory optimized table, and add it to the appropriate filegroup. In the InMem table you'll notice that I create a
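
As a rough sketch of those steps (the database name, container path, table names, and bucket count below are all placeholders, not the values from the original post):

-- Add a memory-optimized filegroup, then a container (a directory) to it.
ALTER DATABASE InMemDemo
    ADD FILEGROUP InMemDemo_MOD CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE InMemDemo
    ADD FILE (NAME = 'InMemDemo_MOD_Container',
              FILENAME = 'C:\Data\InMemDemo_MOD')
    TO FILEGROUP InMemDemo_MOD;
GO

-- Traditional disk-based table.
CREATE TABLE dbo.OrdersDisk
(
    OrderId   INT IDENTITY(1,1) PRIMARY KEY,
    OrderDate DATETIME2 NOT NULL
);
GO

-- Memory-optimized table; note the nonclustered hash primary key.
CREATE TABLE dbo.OrdersInMem
(
    OrderId   INT IDENTITY(1,1)
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    OrderDate DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

SCHEMA_AND_DATA keeps the rows durable across restarts; SCHEMA_ONLY keeps only the table definition and loses the data on restart.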