In my first post about the Sony VAIO PCV-J120, I focused on the hardware. After receiving the missing components (extra memory, an additional hard drive, and the FireWire, TV-tuner, and modem cards), I went ahead and installed Windows 98 SE. I had forgotten how painful even a simple install could be! If you forgot too – as I did – you first need to boot off the Windows 98 CD. Of course, the install will fail, so you have to switch from the installation-method prompt to the limited MS-DOS prompt. From there, cd into the win98 folder and use the fdisk utility to create primary DOS partitions (which you then need to activate).
Then, format the partitions (at least C:\). Also, copy the whole Win98 folder onto one of the freshly formatted partitions. Only then, from this copy, do you start the setup! We had to be motivated! And of course, each time an additional file is needed, you will have to re-point the installer to that copy made earlier…
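For reference, the sequence above looks roughly like this at the MS-DOS prompt. This is a sketch from memory, assuming the CD-ROM ends up as drive D: and a single partition becomes C:; your drive letters may differ.

```shell
REM Boot from the Windows 98 CD and drop to the MS-DOS prompt.
REM Create a primary DOS partition and mark it active:
FDISK
REM Reboot so the new partition table takes effect, then format C:
FORMAT C:
REM Copy the whole Win98 folder from the CD (here, drive D:) locally:
MD C:\WIN98
COPY D:\WIN98\*.* C:\WIN98
REM Start setup from the local copy:
C:
CD \WIN98
SETUP
```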
But all this is simple compared to what comes next: installing the missing drivers. It is not the installation that is problematic; it is finding the right drivers. I imagined that I would find all the drivers, and even more, on the Internet. Well, I could not have been more wrong. The network card driver, various chipset component drivers, etc. It was a pain, and I still have a couple of yellow bangs I need to resolve.
The other issue I faced during this exercise concerned access to the Internet. It has become so pervasive that if you also wish to enjoy its benefits, you need to jump through a few extra hoops. Of course, any browser from the ‘90s will fail to render anything today. I am currently exploring, with some success, the use of KernelEx, which also requires installing the Microsoft Layer for Unicode on Windows 95, 98, and Me Systems, 1.1.3790.0. After this, you can install a somewhat less-outdated version of Firefox, Opera, etc. I tried Opera 9.6 with limited success. There is still some work to be done, especially on supporting HTTPS.

Finally, I also installed an anti-virus that still works under Windows 98. I went with the active open-source ClamWin project.

I have only focused on the painful aspects of building such a system from scratch in 2020. But once you are done, a vintage PC is a lot of fun, especially when you run vintage apps and games. I appreciate running the Borland Turbo dev tools. Yeah, Turbo FORTH 83 and C 2.1!
3 thoughts on “It’s a Sony! – Part 2”
Since developers are assuming evergreen OSes and browsers nowadays, the prospects of accessing any web content with anything older than a few months (especially legacy systems) won’t get better anytime soon. How could we deal with this?
What about setting up a proxy server that re-transpiles any JS and CSS content on the fly into something a Mozilla browser with a code base a few years old may understand?
It may be worth assembling a Raspberry Pi image for easy setup and distribution. (Of course, we would need to update the transpilers every so often, keeping them ever-so-greenish, probably by the use of an automatic update script.)
You are absolutely right, Norbert; it is sort of mission impossible as-is. With your transpiler / proxy idea, however, it is feasible. Even having the transpiler remove items it knows the antique browser cannot handle would be a plus. The use of the Raspberry Pi reminded me of the Pi-Hole project.
It may be a coincidence, but around the time evergreen browsers (with monthly major revision updates) became a thing, maintenance for most legacy-targeting browsers stopped. There’s an obvious conflict.
I guess tools like Babel still target a JS standard that most of these legacy browsers support. (However, there may be an issue with code aggressively assuming the presence of objects, properties, or even entire APIs that didn’t exist a few years ago. To be on the safe side, we’d need some dummy objects for those as well. But this is probably an edge case. Getting around arrow functions, let, const, and maybe iterators should solve a major part of the practical issues.)
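The "dummy objects" idea could be done with shims written in old-style (ES3-compatible) JavaScript that the proxy injects ahead of the transpiled code. A minimal sketch, with two illustrative examples only; a real setup would ship a full shim bundle such as core-js:

```javascript
// ES3-compatible replacement for Object.assign.
function assignShim(target) {
  for (var i = 1; i < arguments.length; i++) {
    var src = arguments[i];
    if (src == null) continue;
    for (var key in src) {
      if (Object.prototype.hasOwnProperty.call(src, key)) {
        target[key] = src[key];
      }
    }
  }
  return target;
}

// ES3-compatible replacement for Array.prototype.includes.
function includesShim(list, value) {
  for (var i = 0; i < list.length; i++) {
    // The second clause makes NaN match itself, as the real API does.
    if (list[i] === value || (value !== value && list[i] !== list[i])) {
      return true;
    }
  }
  return false;
}

// Install the shims only where the engine lacks the real API, so modern
// browsers keep their native implementations.
if (typeof Object.assign !== 'function') {
  Object.assign = assignShim;
}
if (!Array.prototype.includes) {
  Array.prototype.includes = function (value) {
    return includesShim(this, value);
  };
}
```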
Combining this with Pi-Hole would be another win, thereby getting rid of trackers and ads and all the performance issues that come with them. Another issue is newer versions of HTTP and certificates, which could also be handled transparently by the proxy.