Building the long-term solution first is often wrong even when you need one.
I use the quick-and-dirty method until I know I'll be doing that task regularly and I know enough about the problem that I don't need to make wild guesses about the design.
This is a clear example of what I was talking about. The fact that you use a given method doesn't mean other methods are "often wrong even when you need one". That's just the way you think and work in the position you're in now.
If your boss at a workplace where you've been working for maybe a month or a little longer tells you to write a program to do X, you'll probably pull out all the knowledge you have to write that program as if it were the most important program in the world.
Learning to work "quick and dirty" is something I'm actually working on improving, personally.
I don't necessarily do BDUF (Big Design Up Front) or anything, but I generally do more "engineering" than is strictly necessary. Partly it's my personality, but part of it is that, in my experience, prototypes have a way of becoming production systems.
Last time I had to do a prototype for work, I split the difference: working quick and dirty, just meeting the minimal requirements, but also documenting what we would need to do if we decided to move forward with it. That felt like a good balance.
> First task is getting some static data out of a few dusty html tables deep in the intranet and dumping it on a new table in the dev DB
For something like this, the first thing I'd do is extract the first row manually in Vim, recording my actions as a macro, then use that macro to extract the rest. That takes a few minutes at most: depending on the structure of the data it might take seconds, or there might be a bit more massaging necessary, but no more than about five minutes for this step.

To streamline the process, I have a keybinding that maps execution of the macro in the q register to the spacebar: I double-tap q to start recording, press q again to stop, and press (or hold) the spacebar to replay it. In my experience, 2 or 3 passes are usually enough to cleanly extract a table of data from some random HTML page.
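The mapping itself is tiny; a minimal sketch of what it might look like in a vimrc (the exact form of mine may differ):

    " Replay the macro stored in register q with the spacebar.
    " Recording needs no mapping: qq starts recording into q, q stops it.
    nnoremap <Space> @q

A count prefix works too, so 200<Space> replays the macro 200 times instead of holding the key down.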
If the job was a one-and-done, then I'm already finished. If I'm meant to be creating something that can be reused later, I still do the above anyway, and now I have a known-good result against which to test whatever more permanent solution I'm writing.
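For the "dumping it on a new table in the dev DB" half of the quoted task, the quick-and-dirty end can stay just as small. A minimal sketch, assuming the macro passes left a two-column CSV, and using SQLite plus Python's standard library as a stand-in for whatever the dev DB actually is (all file and table names here are hypothetical):

    import csv
    import sqlite3

    # Hypothetical names: "extracted.csv" is the output of the macro passes;
    # "dev.db" / "legacy_data" stand in for the real dev database and table.
    con = sqlite3.connect("dev.db")
    con.execute("CREATE TABLE IF NOT EXISTS legacy_data (name TEXT, value TEXT)")
    with open("extracted.csv", newline="") as f:
        rows = [row for row in csv.reader(f) if row]  # skip blank lines
    con.executemany("INSERT INTO legacy_data (name, value) VALUES (?, ?)", rows)
    con.commit()
    con.close()

The script itself is throwaway; the useful part is that the manually extracted data doubles as the fixture you test the permanent solution against.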