in reply to SOLVED: Storing UTF-8 data into database from scraped web page
Another trick that can help in diagnosing problems like these is to write a script that dumps the raw bytes in hexadecimal. There are simply too many places where encoding can be attempted by the different pieces of software involved, all of which are trying their best to be helpful. I have even been known to go to the actual disk files where the information is stored and display some of the pages with the hexdump command. Do the same with the incoming data from the site: "wget" it with all language features turned off and hexdump that file. Determine which bytes are coming in from the site and which bytes are getting written to disk. From this you can piece together where conversion is happening and where you want it to be happening.
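If you'd rather not shell out to hexdump, here is a minimal Python sketch of such a byte-dumping script (the function name and layout are my own; the output roughly mimics `hexdump -C`): it prints the offset, the hex bytes, and a printable-ASCII column so you can eyeball exactly what is on disk or on the wire.

```python
import sys

def hexdump(data, width=16):
    """Return a hexdump -C style rendering of a bytes object."""
    lines = []
    for offset in range(0, len(data), width):
        chunk = data[offset:offset + width]
        # Two hex digits per byte, space-separated.
        hex_part = " ".join(f"{b:02x}" for b in chunk)
        # Printable ASCII as-is; everything else shown as a dot.
        ascii_part = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{offset:08x}  {hex_part:<{width * 3}} |{ascii_part}|")
    return "\n".join(lines)

if __name__ == "__main__":
    # Dump a file named on the command line, read in binary mode so
    # no layer gets a chance to "helpfully" re-encode anything.
    with open(sys.argv[1], "rb") as f:
        print(hexdump(f.read()))
```

The tell-tale signature to look for: a correctly UTF-8-encoded "é" shows up as the two bytes `c3 a9`, while a double-encoded one shows up as four bytes, `c3 83 c2 a9`.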