Most developers are used to programming in procedural or object-oriented languages. SQL, as a declarative language, is quite different. In declarative languages like SQL, you program what you want the result to be, not the procedure to get it. For instance, “give me all the people with the first name starting with the letter S from a certain table.” Unlike procedural programming (or even methods in object-oriented languages), you do not say how to get the information. This is, I believe, why many developers want to give the query optimizer “hints” on how to do its job.
That being said, I will list the top 8 Basic SQL Practices I live by and attempt to enforce. Please feel free to comment with your own (or post your own, linking back here).
In no particular order:
1) Always use explicit joins. If I mean INNER JOIN, then I use INNER JOIN; no plain “JOIN” on its own. Never, ever, ever use a comma join; I consider that a mistake. If I explicitly state “CROSS JOIN”, then I know I have consciously made that decision. Also, keep join conditions in an ON or USING clause; they should not go in the WHERE clause. I also put my join conditions in parentheses; for whatever reason, I find:
ON (foo=bar AND baz=bop) WHERE a=b
makes it easier to see that the join condition contains two conditions than
ON foo=bar AND baz=bop WHERE a=b
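To show the whole statement, here is a sketch with two hypothetical tables, customer and orders, joined on customer_id:

SELECT c.name, o.total
FROM customer AS c
INNER JOIN orders AS o ON (c.customer_id = o.customer_id)
WHERE o.total > 100;

rather than the comma-join version, which buries the join condition in the WHERE clause:

SELECT c.name, o.total
FROM customer AS c, orders AS o
WHERE c.customer_id = o.customer_id AND o.total > 100;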
2) Always define field names. No using SELECT * or INSERT INTO table VALUES. It’s a pain, and even more of a pain given that mysqldump does not specify INSERT fields. However, if it’s important enough to save in a text file (i.e., it’s seed data or a migration script), then it gets explicit field names.
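As a sketch, with a hypothetical person table:

INSERT INTO person (first_name, last_name, birth_date)
VALUES ('Susan', 'Smith', '1975-03-14');

SELECT first_name, last_name
FROM person;

rather than INSERT INTO person VALUES (...) or SELECT * FROM person, both of which depend on the current column order of the table.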
3) Always use the database server’s timestamp. Web servers may have disparate clocks, and reports may be generated on different servers than the ones that inserted the data.
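For instance, assuming a hypothetical log table with a created_at column, I let the database fill in the time rather than passing in a value generated on the web server:

INSERT INTO log (message, created_at)
VALUES ('user logged in', NOW());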
4) Store IPs as integers with INET_ATON and retrieve them with INET_NTOA.
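A quick sketch, assuming a hypothetical visits table whose ip column is an INT UNSIGNED:

INSERT INTO visits (ip) VALUES (INET_ATON('192.168.1.10'));

SELECT INET_NTOA(ip) AS ip_address FROM visits;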
5) When doing reports, network traffic is usually the biggest bottleneck. If you are going to retrieve information, it is better to retrieve it in chunks, which will likely be larger than a single logical piece. For instance, with state reporting, instead of making 50 connections for the states in the US, get them all at once. If the dataset is very large and folks do not want to stare at a blank page while the report loads, use paging with LIMIT to grab, say, 1000 entries at a time and display them on the screen, so people can start looking at the data while the rest is being fetched.
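For example, assuming a hypothetical orders table with a state column, one round trip covers all 50 states:

SELECT state, COUNT(*) AS num_orders
FROM orders
GROUP BY state;

and paging through a large result set might look like:

SELECT order_id, state, total
FROM orders
ORDER BY order_id
LIMIT 1000 OFFSET 0;  -- next page: LIMIT 1000 OFFSET 1000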
6) Running a query in a loop is usually a bad idea. If you are executing the same query with different data, consider building a query string using UNION and executing it at the end of the loop, so you can execute multiple queries with only one trip across the network to the database.
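A sketch of what I mean, with a hypothetical customer table; the three lookups go across the network as a single statement instead of three:

SELECT customer_id, name FROM customer WHERE customer_id = 7
UNION ALL
SELECT customer_id, name FROM customer WHERE customer_id = 42
UNION ALL
SELECT customer_id, name FROM customer WHERE customer_id = 138;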
7) Do not be afraid of JOINs. They are not necessarily resource intensive, given good indexing. Most of the time, a denormalized schema without a join ends up being worse than a normalized one using a join. When data is redundant, keeping it consistent takes up more cycles than designing the schema to enforce integrity in the first place.
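As a rough sketch with the same hypothetical customer and orders tables, the join stays cheap as long as the joined column is indexed:

ALTER TABLE orders ADD INDEX idx_customer_id (customer_id);

SELECT c.name, o.order_id, o.total
FROM customer AS c
INNER JOIN orders AS o ON (c.customer_id = o.customer_id);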
8) Limit the use of correlated subqueries; often they can be replaced with a JOIN.
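For example, a correlated subquery like this (hypothetical tables again):

SELECT c.name,
  (SELECT COUNT(*) FROM orders AS o WHERE o.customer_id = c.customer_id) AS num_orders
FROM customer AS c;

can usually be rewritten as a join with GROUP BY:

SELECT c.name, COUNT(o.order_id) AS num_orders
FROM customer AS c
LEFT JOIN orders AS o ON (c.customer_id = o.customer_id)
GROUP BY c.customer_id, c.name;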
(I also try to put SQL commands in capital letters to help me easily spot fields and variables I use).
I personally find SQL commands in all caps harder to read, as it’s not natural.
If a given query can be accomplished with either a JOIN or a subquery, which should I use?
I’ve heard that subqueries are sometimes better because the subquery is reduced to a constant, but is that entirely true?