I have no clue as to why your browser's behavior changed, but have you considered using web scraping software (en.wikipedia.org) to download the files to a folder on your hard drive and then concatenating the files together?
I can't really recommend a particular web scraping program, since I have no experience in that area, but I can tell you how to concatenate all of the files in a directory together on a Windows machine. Also, since I don't know how savvy you are with batch files and DOS commands, and other people with no experience may find this thread later, I will try not to assume that you know anything. Please don't feel insulted if I tell you something that you think *everyone* knows.
First, you should create a folder to put the data files in. I just created a folder called testing on my desktop, but you can use whatever name you want and put it wherever you like. Within the folder that you created, create a second folder called archive. So, for me (on WinXP, sp2), when I have the archive folder open, the address near the top of the folder looks like this:
C:\Documents and Settings\aerinber\Desktop\testing\archive
Note: aerinber is my login name on this computer, so your address will not match mine exactly, even if you do exactly what I do.
You may prefer to name your data folder data and put it in My Documents. My Documents is actually an alias for something like C:\Documents and Settings\aerinber\My Documents, so if you have trouble finding the *real* name of your My Documents folder, you may want to use some other location. Anyway, if you do put your data folder in My Documents, the address of your archive folder could be similar to this:
C:\Documents and Settings\aerinber\My Documents\data\archive
After you have created your data folder and archive folder, copy and paste the following script into Notepad or WordPad
archive.bat:

for /f "tokens=1-4 delims=/ " %%a in ('date /t') do (set weekday=%%a& set day=%%b& set month=%%c& set year=%%d)
...and save it as "archive.bat" in the data folder that you created. Make sure you use the double quotes around the file name, or Windows will think that you want to name the file archive.bat.txt, and you need the .bat extension on the end for it to run.
Note2: You may need to edit this script if you aren't copying files that end in .txt. For example, if you are copying web pages, they may end in .html. If that is the case, everywhere the script has .txt needs to be replaced with .html before this script will work correctly.
When you set up the web scraper software, designate your data folder as the destination for the files that it creates/downloads, or you will need to cut them from the folder that they are created in and paste them into the data folder that has archive.bat in it. When you run your web scraper and the files you want are in the data folder, double-click on the icon for archive.bat.
When you run it, archive.bat will
delete a file called current_data.txt if it exists
concatenate the data from all files that end in .txt in the directory that archive.bat is running in
put the concatenated data in a file called data.txt
copy all files in that directory that end in .txt into the archive folder
add the date to the front of the data.txt filename
and copy it back to the data folder with a name of current_data.txt
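The quote box above only captured the date-parsing line of the script, so here is a sketch of what the rest of archive.bat might look like, pieced together from the step list above. This is my reconstruction, not the original script: the extra del data.txt line is my own addition (so a leftover data.txt from an earlier run doesn't get concatenated into the new one), and the dated file name format is just a guess.

```bat
@echo off
rem Sketch of archive.bat -- reconstructed from the steps above.

rem Parse today's date into variables (the line quoted earlier)
for /f "tokens=1-4 delims=/ " %%a in ('date /t') do (set weekday=%%a& set day=%%b& set month=%%c& set year=%%d)

rem 1. Delete current_data.txt if it exists
if exist current_data.txt del current_data.txt
rem (my addition: also remove a leftover data.txt so it isn't concatenated in)
if exist data.txt del data.txt

rem 2-3. Concatenate every .txt file in this folder into data.txt
rem (copy with a wildcard source and a single destination file appends them together)
copy *.txt data.txt

rem 4. Copy all of the .txt files (including data.txt) into the archive folder
copy *.txt archive

rem 5. Add the date to the front of the archived data.txt (name format is a guess)
ren archive\data.txt %year%-%month%-%day%_data.txt

rem 6. Copy the dated file back here as current_data.txt
copy archive\%year%-%month%-%day%_data.txt current_data.txt
```

As in Note2, if your files end in .html instead of .txt, every .txt in the sketch would need to become .html.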
Note3: If you run archive.bat more than once in a day, it will not be able to rename data.txt to the dated name since that file already exists. That means you will still have a file in the archive directory named data.txt, but it will go away when you run archive.bat on another day, so it's not really a problem or anything.
You probably will still need to open the current_data.txt file and make some minor formatting edits, and keep in mind that the archived data file will not have those edits unless you open that file and make them there too. But if this works, it should make your job way easier. Good luck, and if you decide to try this, let me know if the instructions need clarification.
I appreciate all of the help, guys. I'm going to try the suggestions in hopes of streamlining my process. I guess my only concern is that on some of the particular pages, the data is out of order and I have to specifically look at each individual listing on each page for the specific data I need.
I hate to bore the board, and/or waste CRZ's and Zim's bandwidth, so if anyone wants to help me out offline with making my little side gig easier through private messages, I'd be more than willing to compensate them for their time via Paypal.
As always, thanks for the guidance...this board amazes me with the stuff I've gotten off of it over the years.
Well, apparently, Google is great, but it filters out tons and tons of results to give us the best ones. What if we really want to learn something new? Maybe this will be the answer. http://blog.wolfram.com/2009/03/05/wolframalpha-is-coming/