As one of those individuals who might have a knack for it (but not a love for it: I am passionate about finding solutions to technical problems, and coding is one of the tools I use), I have found that one thing serves me well, especially after too many 15-hour (or longer) stints in a row: having a process under your fingers that you can run on autopilot to handle the little things.
On a recent ETL project I was on, there were a few deadline-induced long days, especially when getting through a lot of the cleansing tickets (over a century of data, 20-25 years of it in digital form, from various sources and in various states of cleanliness). This was the process:
- Take the next ticket in the queue
- Find a representative dataset that reproduces it
- Incorporate the representative dataset into a test
- Ensure the test fails
- Commit the test
- Fix until all tests pass
- Commit the code
- Refactor if needed, ensuring all tests still pass
- If a new bug is found while fixing the current one, log a ticket
- Compare the results of your cleanup process to the previous run. If necessary, fix your test and refine your implementation (or log a bug). This was essential for ensuring there were no changes to the dataset beyond what you intended. A temporary Git repo + git diff... can work well with the right kind of data.
- Commit the code with a tag indicating it is ready to test for deployment
- Repeat
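The "representative dataset into a failing test" steps above can be sketched as a minimal pytest-style example. The cleansing function, the date formats, and the sample rows here are all hypothetical stand-ins for whatever a real cleansing ticket would contain.

```python
# Minimal sketch of the "representative dataset -> failing test" step.
# clean_dates and the sample rows are hypothetical; substitute the
# cleansing logic and data from the actual ticket.
from datetime import datetime

def clean_dates(rows):
    """Normalize assorted date formats to ISO 8601 (YYYY-MM-DD)."""
    formats = ("%Y-%m-%d", "%m/%d/%Y", "%d %b %Y")
    cleaned = []
    for row in rows:
        for fmt in formats:
            try:
                parsed = datetime.strptime(row["date"], fmt)
                cleaned.append({**row, "date": parsed.strftime("%Y-%m-%d")})
                break
            except ValueError:
                continue
        else:
            raise ValueError(f"Unparseable date: {row['date']!r}")
    return cleaned

def test_mixed_date_formats_are_normalized():
    # Representative rows reproducing the ticket's symptom:
    # the same date arriving in two different source formats.
    rows = [{"date": "03/15/1998"}, {"date": "15 Mar 1998"}]
    assert [r["date"] for r in clean_dates(rows)] == ["1998-03-15", "1998-03-15"]
```

Before the fix exists, the test fails and gets committed on its own; once `clean_dates` handles the ticket's formats, the same test documents the fix.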
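The run-over-run comparison step can also be sketched in code. The original approach used a temporary Git repo and git diff; this is the same idea expressed with Python's `difflib`, with hypothetical file contents, to show how a diff can be filtered down to only the changes the ticket did not intend.

```python
# Sketch of the run-over-run comparison: diff the current cleansed
# output against the previous run and flag any change that was not
# part of the intended fix. Data and intended-change set are hypothetical.
import difflib

def unintended_changes(previous_lines, current_lines, expected_changes):
    """Return diff lines whose content is not in the expected-change set."""
    diff = difflib.unified_diff(previous_lines, current_lines, lineterm="")
    changes = [l for l in diff
               if l.startswith(("+", "-")) and not l.startswith(("+++", "---"))]
    return [l for l in changes if l[1:] not in expected_changes]

previous = ["id,date", "1,1998-03-15", "2,15 Mar 1998"]
current  = ["id,date", "1,1998-03-15", "2,1998-03-15"]
# The ticket intended exactly one row to change:
intended = {"2,15 Mar 1998", "2,1998-03-15"}
leftover = unintended_changes(previous, current, intended)
# An empty result means the run only changed what the ticket intended.
```

With a temporary Git repo the equivalent move is committing each run's output and diffing against the prior commit; either way, the point is that an unexpected line in the diff becomes a test fix, an implementation refinement, or a new bug ticket.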
The mechanics become automatic, and the few caffeine-fueled brain cells still functioning can focus on the important aspects. Do not take this as a suggestion that working in that mental state is good, recommended, or prudent. Sometimes it just happens.