When you design a database, you must make sure that the database performs all the important functions correctly and quickly. Good database performance can be achieved by implementing correct database design, hardware-based solutions, partitioning, and capacity planning for database optimisation. Some performance issues can be resolved after the database is in production. However, other performance issues may be the result of poor database design and can be addressed only by changing the structure and design of the database.
When you design and implement a database, you should identify the largest tables and the most complex processes that the database will perform, and give special consideration to performance when you design those tables. Additionally, you should consider the effect on performance of increasing the number of users who can access the database.
Example of design changes that improve performance
If a table that contains hundreds of thousands of rows must be summarised for a daily report, you can add a column or columns to the table that contain pre-aggregated data used only for the report.
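One way to keep such pre-aggregated data current is a minimal sketch like the following, using Python's built-in sqlite3 module. The `orders` and `daily_totals` table names and columns are hypothetical illustrations, not taken from the text; the sketch stores the running totals in a small summary table that the daily report reads instead of scanning the large table.

```python
import sqlite3

# Hypothetical schema: a large orders table plus a small table that
# holds the previously aggregated data, used only for the daily report.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, order_day TEXT, amount REAL)")
cur.execute("CREATE TABLE daily_totals (order_day TEXT PRIMARY KEY, total REAL)")

def record_order(day, amount):
    """Insert an order and keep the day's pre-aggregated total current."""
    cur.execute("INSERT INTO orders (order_day, amount) VALUES (?, ?)", (day, amount))
    cur.execute("UPDATE daily_totals SET total = total + ? WHERE order_day = ?",
                (amount, day))
    if cur.rowcount == 0:  # first order of the day: create the summary row
        cur.execute("INSERT INTO daily_totals (order_day, total) VALUES (?, ?)",
                    (day, amount))

for amount in (10.0, 15.0, 25.0):
    record_order("2024-01-02", amount)

# The daily report becomes a single-row lookup instead of a scan and
# SUM over hundreds of thousands of order rows.
total = cur.execute(
    "SELECT total FROM daily_totals WHERE order_day = ?", ("2024-01-02",)
).fetchone()[0]
print(total)  # 50.0
```

The trade-off is that every insert does a little extra work to maintain the aggregate, in exchange for a much cheaper report query.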
Databases can be over-normalised. This means the data is defined in many small, interrelated tables. When the database processes the data, it must perform far more work to combine the related data, and this additional processing can reduce the performance of the database. In these situations, denormalising the database slightly to simplify complex processes can improve performance.
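The join cost being traded away can be sketched as follows, again with sqlite3 and hypothetical `customers`/`orders` tables that are not from the text. In the slightly denormalised variant, the customer name is copied redundantly into each order row so that a frequent report reads one table with no join.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalised form: the report must join two tables on every run.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, "
            "amount REAL)")

# Slightly denormalised form: the customer name is duplicated into the
# order row, trading some redundancy for a join-free read path.
cur.execute("CREATE TABLE orders_denorm (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER, customer_name TEXT, amount REAL)")

cur.execute("INSERT INTO customers VALUES (1, 'Acme')")
cur.execute("INSERT INTO orders VALUES (1, 1, 99.0)")
cur.execute("INSERT INTO orders_denorm VALUES (1, 1, 'Acme', 99.0)")

# Both queries return the same report row; the second avoids the join.
joined = cur.execute(
    "SELECT c.name, o.amount FROM orders o "
    "JOIN customers c ON c.id = o.customer_id"
).fetchone()
flat = cur.execute("SELECT customer_name, amount FROM orders_denorm").fetchone()
print(joined == flat)  # True
```

The cost of this choice is that the duplicated name must be kept consistent whenever a customer record changes, which is why the text recommends denormalising only slightly and only where a complex process genuinely benefits.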