Regular maintenance and upgrades are an essential part of keeping databases and servers usable. Many common practices come at a high price with low results, and many developers have turned to alternative methods to improve the performance of their servers.
Query tuning can fix poorly written queries and inefficient indexes. Use these 10 tips to improve your database in a snap.
1. Separate Data and Log Files
This practice is often overlooked for no good reason. The data and log files should be placed on different physical drive arrays whenever possible, especially when using DAS or SAN. The purpose is to separate the random access pattern of data reads from the sequential access pattern of transaction log writes.
Sometimes, drives are carved from the same larger disk array rather than from separate physical drives, which defeats the purpose of the separation.
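The separation is typically set up when the database is created, by pointing the data and log files at different drive letters. A minimal sketch, assuming D: is the data array and L: is the log array (the database name and paths are illustrative):

```sql
-- Hypothetical database with data and log on separate physical arrays
CREATE DATABASE SalesDB
ON PRIMARY
    (NAME = SalesDB_Data,
     FILENAME = 'D:\SQLData\SalesDB.mdf')   -- random-access data reads
LOG ON
    (NAME = SalesDB_Log,
     FILENAME = 'L:\SQLLogs\SalesDB.ldf');  -- sequential log writes
```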
2. Limit Data Selection
A query will run faster when there is less data to retrieve. Do as much filtering as you can on the server's end; less data will be sent over the network and results will come back faster. Even small changes, such as returning "New York" instead of "New York City" when the suffix isn't needed, can improve SQL Server performance.
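A simple sketch of the idea, assuming a hypothetical Customers table:

```sql
-- Avoid: retrieves every row and column, leaving filtering to the client
SELECT * FROM Customers;

-- Better: filter on the server so less data crosses the network
SELECT CustomerID, City
FROM Customers
WHERE City = 'New York';
```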
3. Don’t Shrink the Data Files
Shrinking data files is a bad practice in general and can impact performance in a few ways, including index fragmentation that causes queries to suffer. If Instant File Initialization isn't turned on, the subsequent regrowth can negatively impact performance and cause timeouts. There may be times when shrinking data files is needed, but be sure you understand the impact before taking action.
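On SQL Server 2016 SP1 and later you can check whether Instant File Initialization is enabled for the service account before letting files grow or shrink; earlier versions require inspecting the service account's "Perform volume maintenance tasks" privilege instead:

```sql
-- Requires SQL Server 2016 SP1 or later
SELECT servicename, instant_file_initialization_enabled
FROM sys.dm_server_services;
```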
4. Database Statistics
The query optimizer analyzes the available statistics and chooses the least expensive execution plan to satisfy a query. Statistics that are outdated or missing will cause prolonged response times because the plan chosen will be less optimal.
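Statistics can be refreshed manually when they fall out of date. A minimal sketch, where dbo.Orders is a hypothetical table name:

```sql
-- Refresh statistics for one table, scanning every row
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Or refresh out-of-date statistics for every table in the current database
EXEC sp_updatestats;
```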
5. Predetermine the Expected Growth
You can minimize the maintenance cost of indexes by specifying an appropriate fill factor value when creating them. When data in a table changes, the database has to reorganize the stored pages to account for the new rows. If you are expecting regular additions of new rows, you can leave room for that growth in the index.
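For example, a fill factor of 80 leaves 20% free space on each leaf page for future inserts. The table and index names below are illustrative:

```sql
-- Leave 20% of each leaf page empty to absorb future inserts
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
ON dbo.Orders (CustomerID)
WITH (FILLFACTOR = 80);
```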
6. Alter the MAX Memory Limit
64-bit versions of SQL Server have seen recent improvements in how memory is allocated and shared with the operating system, but leaving the MAX memory setting at its default is not ideal.
You should set the MAX memory setting to at least 1 or 2 GB less than the total amount of memory on the server. How much memory you leave will depend on which other programs are running and how much memory they require.
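The setting is changed with sp_configure. A sketch for a hypothetical server with 32 GB of RAM, leaving about 4 GB for the operating system:

```sql
-- 'max server memory' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap SQL Server at 28 GB (value is in MB)
EXEC sp_configure 'max server memory (MB)', 28672;
RECONFIGURE;
```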
7. Stick With Single Changes
Rather than making many changes at once and muddying the results, aim to find the most expensive queries first and tune those. Making small changes one at a time is more likely to produce optimal results, because you can measure the effect of each change in isolation.
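One way to find the most expensive queries is the plan-cache DMVs. This sketch ranks cached statements by total CPU time:

```sql
-- Top 5 cached statements by total CPU time since they were cached
SELECT TOP 5
    qs.total_worker_time AS total_cpu_microseconds,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```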
8. Before Loading Data, Drop Indexes
This will help the insert statements run faster. Once the inserts are done, you can rebuild the index. Use a staging table to load data if you are going to insert thousands of rows into a system; moving data from one table to another is much quicker than loading it from an external source.
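The pattern looks like this; the table, index, and file path below are illustrative:

```sql
-- 1. Drop the nonclustered index before the bulk load
DROP INDEX IX_Staging_OrderDate ON dbo.Staging;

-- 2. Load the external data without index maintenance overhead
BULK INSERT dbo.Staging
FROM 'C:\Import\orders.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 3. Rebuild the index once the load is finished
CREATE NONCLUSTERED INDEX IX_Staging_OrderDate
ON dbo.Staging (OrderDate);
```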
9. Utilizing Constraints
Advanced query analysis can benefit from knowing and using constraints. Avoid overlapping or duplicate indexes by reviewing the existing indexes, keys and constraints. Information about existing indexes can be retrieved by running sp_helpindex.
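For example, against a hypothetical Orders table:

```sql
-- List the existing indexes on a table
EXEC sp_helpindex 'dbo.Orders';

-- List the existing constraints (keys, defaults, checks) on the same table
EXEC sp_helpconstraint 'dbo.Orders';
```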
10. Extra Columns
When queries return extra columns, the system will perform poorly. This is because it becomes increasingly difficult for the execution plan to use specific index operations, such as seeks on a covering index. To find unnecessary columns, look for SELECT * and scalar functions when examining the query.
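A sketch of the difference, using a hypothetical Orders table with a nonclustered index on CustomerID that includes OrderDate:

```sql
-- Avoid: SELECT * forces the plan to fetch every column,
-- so a narrow nonclustered index cannot cover the query
SELECT * FROM dbo.Orders WHERE CustomerID = 42;

-- Better: name only the columns you need, so an index on
-- (CustomerID) INCLUDE (OrderDate) can satisfy the query by itself
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 42;
```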