trapped in a pile of diodes. SAY NO TO CRYPTO! Articles from The Retro Web (https://blog.theretroweb.com) and writeups for videos made by @themajortechie, of whom this is totally not a sideblog for.
Further notes 6.16.2024:
Welp. I did a big dum.
So. One-Core-API. I'd completely forgotten that I was already trying it out earlier this year when Minecraft snapshot 24w14a released. Java 21 runs under One-Core-API, but calls some function in kernel32.dll that is either unimplemented or bugged in the current version. As a result, Minecraft is guaranteed to crash upon world load, every single time.
Even stranger, it appears that Minecraft, at least using ATLauncher, fails to launch with more than 384MB of RAM allocated. (EDIT: this happens with all launchers.)
Both of these are issues that do not occur on Windows 7. I'll be switching from XP to 7 in the next reblog.
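(Side note for anyone who wants to chase the kernel32.dll angle themselves: you can narrow down which exports are missing or stubbed by probing them with GetProcAddress. A minimal sketch below; the function list is just my guess at post-XP-era APIs a modern JVM might touch, not a confirmed culprit list.)

```c
#include <windows.h>
#include <stdio.h>

/* Probe kernel32.dll for exports added after XP. Any name that comes  */
/* back NULL is something One-Core-API must supply (or stub) for a     */
/* modern JVM to run. The list below is illustrative, not exhaustive.  */
int main(void) {
    const char *names[] = {
        "GetTickCount64", "InitializeSRWLock", "AddDllDirectory",
        "GetSystemTimePreciseAsFileTime", "SetThreadErrorMode"
    };
    HMODULE k32 = GetModuleHandleA("kernel32.dll");
    for (int i = 0; i < (int)(sizeof(names) / sizeof(names[0])); i++) {
        FARPROC p = GetProcAddress(k32, names[i]);
        printf("%-32s %s\n", names[i], p ? "present" : "MISSING");
    }
    return 0;
}
```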
Writeup: AOpen i945GMm-HL shenanigans
AOpen i945GMm-HL - The Retro Web
Welp. This board is weirder than I ever thought it'd be. Not the board in general, but the specific one I bought.
To begin, it turns out that my particular board, and likely many others of the same model, are OEM-customized boards that AOpen provided to a little company called RM Education. They make all-in-one PCs for the UK market.
...And they are using evaluation BIOSes (in other words, BIOS software that's normally only meant for prototyping and... well, evaluation) in their retail boards.
My specific board contains BIOS version R1.08, which apparently is actually R1.02 underneath. There is also evidence of an R1.07 existing, from a thread on the r/buildapc subreddit, but I doubt that it's been dumped anywhere.
Moving on to the original point of this writeup, I got this board because I wanted to build a system that pushed the 32-bit Core Duo T2700 as far as possible, meaning I needed a mobile-on-desktop board. AOpen built a reputation for doing this sorta stuff in the 2000s, so I went ahead and picked one of their boards (although I would've much preferred the top-of-the-line AOpen i975Xa-YDG if it were being sold anywhere; that's a VERY tasty-looking board, with its full-size DIMM slots, SLI-compatible dual PCIe x16 slots, and the ability to crank the FSB all the way to 305MHz).
Slightly surprisingly, the Core Duo T2700 is quite the overclocker! It's able to push from 2.3GHz all the way up to 2.7GHz with some FSB overclocking using the SetFSB tool. The chip is multiplier-locked to a range of 6.0 to 14.0, so raising the FSB is the only way to push it: core clock is FSB x multiplier, and 195MHz x 14 works out to roughly 2.7GHz.
The board I'm using, the AOpen i945GMm-HL, supports running the FSB up to 195MHz. Stability there is okay-ish, but it crashes when running AIDA64 benchmarks unless I loosen the memory timings from the 5-5-5-15 it uses at 333MHz to 5-6-6-18, which is just the tiniest bit faster than its stock SPD settings for 400MHz operation. With those timings it's much more stable and can run the benchmarks, though unless I lower the FSB from 195MHz to 190MHz, it will consistently crash Chrome when trying to play YouTube videos on integrated graphics. I'll likely experiment to see if adding a card capable of handling the video playback in hardware helps.
For now, this is all for this blog post. I'll follow up with more details as they come, in reblogs. The specs of the system are as follows:
AOpen i945GMm-HL (OC'ed from 166MHz FSB to 195MHz, 190MHz for more stability)
Intel Core Duo T2700 @ 2.7GHz (OC'ed from 2.3GHz)
2x 2GB Crucial DDR2 SO-DIMMs @ 5-6-6-18 timings
Some random 40GB Hitachi HDD lol
Windows XP Pro SP3, fully updated via LegacyUpdate
Supermium Browser (fork of Google Chrome, and the reason I was able to test YouTube playback in the first place)
Coming up: Installing One-Core-API and Java 21 to play Minecraft 1.21 on a 32-bit system out of spite for Microsoft "dropping support" for 32-bit CPUs.
Writeup: The Great(?) OpenGL Wrapper Race
Somehow, I always find myself gravitating back to the S3 ViRGE/DX.
This time around, rather than placing total focus on the ViRGE itself, we'll be taking a look at some OpenGL wrappers!
This writeup will be updated along the way as more videos release.
The setup is as follows:
Matsonic MS7102C
Windows 98SE RTM with the following patches/updates: KernelEX 4.5.2, NUSB v3.3e, Windows Installer 2.0, DirectX 9.0c
Intel Pentium III (Coppermine) @ 750MHz
S3 ViRGE/DX @ 50MHz w/4MB EDO DRAM (using the S3 ViRGE "SuperUni" drivers)
256MB PC-133 SDRAM (Kingston KVR133X64C3/512) (System can only handle 256MB per DIMM)
Sound Blaster AWE32 CT2760
Some random 5GB Hitachi 2.5" HDD that I "borrowed" from a very dead laptop
Java 6 (1.6.0) build 105/Java 6u0
Wrappers to be tested:
S3Mesa - a wrapper based on the Mesa project. It's a full OpenGL implementation sitting on top of S3D and Direct3D, but from the available source code appears to be missing some functionality and is quite unstable.
AltOGL - also a wrapper based on the Mesa project, but relies solely on Direct3D. It is similarly missing some functionality, but is much more widely compatible with cards beyond the Virge thanks to its lack of reliance on S3's proprietary API.
Techland S3D - one of the many wrappers made by Techland for their Quake II engine-based "Crime Cities" game. Although it, like S3's own GLQuake wrappers, only implements as much of the API as the engine needs, it still implements far more features than S3's DX5- and DX6-based wrappers, which are not tested in this wrapper race.
Techland D3D - like AltOGL, Techland's D3D counterpart to their S3D wrapper implements a subset of the OpenGL spec, but still enough to be compatible with a number of titles beyond Quake II engine games.
GLDirect 1.x - A very early version of GLDirect. There exists a license code for this version floating around on the internet that appears to have been used for internal testing by Shuttle, a PC manufacturer that's largely fallen out of relevance and mainly does industrial PCs nowadays.
GLDirect 3.0.0 - One of the last versions of GLDirect to support hardware acceleration on DX6-class graphics cards.
Things tested
WGLGears
ClassiCube 1.3.5 and 1.3.6
Minecraft: Indev in-20091223-1459, Alpha 1.0.4, Beta 1.7.3 (with and without Optifine), Release 1.5.2
Tux Racer
GL Excess benchmark
Half Life 1 v1.1.1.1 KingSoft NoSteam edition
Findings
GLDirect 1.01
OpenGL Version String
Vendor: SciTech Software, Inc.
Renderer: GLDirect S3 Inc. ViRGE/DX/GX
Version: 1.2 Mesa 3.1
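(For reference, the "OpenGL Version String" blocks throughout this writeup are what the active ICD returns from glGetString() once a context is current. A minimal sketch of the query, assuming you link against opengl32.lib; whichever opengl32.dll the loader finds first is the one that answers:)

```c
/* Query the OpenGL vendor/renderer/version strings the way GLInfo      */
/* does: make a throwaway window and context current, then ask the ICD. */
#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>

int main(void) {
    HWND wnd = CreateWindowA("STATIC", "gl", WS_POPUP, 0, 0, 64, 64,
                             NULL, NULL, NULL, NULL);
    HDC dc = GetDC(wnd);
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 16;
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);
    HGLRC rc = wglCreateContext(dc);
    wglMakeCurrent(dc, rc);
    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(rc);
    return 0;
}
```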
Textures do not work at all in every test case besides Half Life.
ClassiCube 1.3.5 and 1.3.6 both fail to render any terrain beyond a greyish horizon and the outlines of blocks. All blocks, items, and text that are able to be rendered are pure white in color and have no textures applied.
Minecraft Indev in-20091223-1459 and Beta 1.7.3 with Optifine crash upon world-load
Minecraft Alpha 1.0.4, Beta 1.7.3, and Release 1.5.2 all crash upon launch.
Tux Racer is able to render 2D textures such as text and graphics, but flickers INTENSELY and is a seizure risk. Beyond this, however, the game will only render a solid white screen.
Half Life launches and runs, but at a terrible 4 FPS. The game lags hard enough that the tram in the opening area of the game is frozen where it is, preventing the player from accessing anything beyond the intro cutscene.
GL Excess crashes instantly.
Performance
GLGears: ~76 FPS
ClassiCube 1.3.5/1.3.6: Unknown; game fails to render
Minecraft in-20091223-1459: Unknown; world crash
Minecraft Alpha 1.0.4: Unknown; crash on game launch
Minecraft Beta 1.7.3: Unknown; crash on game launch
Minecraft Beta 1.7.3 w/Optifine: Unknown; world crash
Minecraft Release 1.5.2: Unknown; crash on game launch
Tux Racer: Unknown; game fails to render in a VERY seizure-inducing way
Half Life: ~4 FPS; gameplay outside of training room is broken
GL Excess: Unknown; instant crash
GLDirect 2.00
youtube
From here on, GLDirect is split between "Game" and "CAD" wrappers, denoted by either a "G" or a "C" after the version number where written in this writeup.
OpenGL Version String (2.00C)
Vendor: SciTech Software, Inc.
Renderer: GLDirect S3 Inc. ViRGE/DX/GX
Version: 1.2 Mesa 3.1
OpenGL Version String (2.00G)
Vendor: SciTech Software, Inc.
Renderer: GLDirect
Version: 1.1
GLDirect 2.00C likes to complain about insufficient color precision.
Changing the color precision from 24-bit up to the maximum 32-bit color does absolutely nothing.
The CAD wrapper is very clearly intended for non-gaming workloads, given how easily it crashes things. However, it is strange that it's the one labeled as the "maximum compatibility" wrapper/driver.
I am not using the SciTech GLDirect 3.0 driver beta seen in the video because it requires a card capable of higher DirectX versions than 6, which is what the S3 ViRGE supports. I may revisit this video idea with a later graphics card in a future video for more thorough testing.
Using the 2.00G wrapper, Minecraft Alpha and Beta have many visual bugs. Text is not rendered at all, for example, and the world selection screen is eerily dim compared to what it should be. Beta in particular extends this darkness to the title screen as well.
Under 2.00G, Minecraft Beta 1.7.3 with Optifine no longer has this darkness.
Under 2.00G, Minecraft Release 1.5.2... inverts the colors of the Mojang logo?
Did you know that if you fall out of the Half Life intro tram in just the right place, you get to see the multiverse?
The framerate starts to rise when this happens, as the tram, after becoming unstuck, appears to trigger the next loading screen. Unfortunately, that loading screen is what seals your fate. Once the tram stops, though, you at least get to meet biblically-accurate Half Life rendering at a smooth double-digit framerate!
Performance (2.00C)
GLGears: 63 FPS
ClassiCube 1.3.5/1.3.6: Unknown; game fails to render
Minecraft in-20091223-1459: Unknown; crash on game launch
Minecraft Alpha 1.0.4: Unknown; crash on game launch
Minecraft Beta 1.7.3: Unknown; crash on game launch
Minecraft Beta 1.7.3 w/Optifine: Unknown; crash on game launch
Minecraft Release 1.5.2: Unknown; assumed crash on game launch based on previous versions' behavior
Tux Racer: Unknown; game fails to render (no longer seizure inducing at least)
Half Life: Unknown; crash on game launch
GL Excess: Unknown; instant crash
Performance (2.00G)
GLGears: 390 FPS; only a black screen was rendered.
ClassiCube 1.3.5/1.3.6: 10-30 FPS range, ~12-15 FPS on average; most of the game still does not render, but text, the hotbar, hand items, and very occasional flickers of geometry do render. Seizure warning.
Minecraft in-20091223-1459: Unknown; crash on world load
Minecraft Alpha 1.0.4: Unknown; crash on world load
Minecraft Beta 1.7.3: Unknown; crash on world load
Minecraft Beta 1.7.3 w/Optifine: Unknown; crash on world load
Minecraft Release 1.5.2: Unknown; crash on game launch
Tux Racer: Unknown; crash on game launch
Half Life: 4-5 FPS; game physics are almost entirely broken down and I'm pretty sure you end up phasing into a higher plane of existence along the way. Trying to enter the training room crashed the entire PC afterwards.
GL Excess: Unknown; instant crash
GLDirect 3.00
youtube
OpenGL Version String (3.00G)
Vendor: SciTech Software, Inc.
Renderer: GLDirect
Version: 1.1
OpenGL Version String (3.00C)
Vendor: SciTech Software, Inc.
Renderer: GLDirect S3 Inc. ViRGE/DX/GX
Version: 1.2 Mesa 3.1
GLDirect 3.00, in both its CAD and Game versions, appears to behave identically to GLDirect 2.00 in almost all cases unless stated otherwise.
Performance (3.00G)
GLGears: 249 FPS; gears are rendered completely incorrectly
ClassiCube 1.3.5: 15-20 FPS on average; most of the game fails to render and the system hard-crashes after a few seconds.
ClassiCube 1.3.6: Insta-crash.
Minecraft: Crash on world load across all versions. Didn't bother testing Release 1.5.2.
Tux Racer: Unknown; crash on game launch
Half Life: ~4 FPS; extremely choppy audio in tutorial level.
GL Excess: Unknown; instant crash
Performance (3.00C)
GLGears: 80 FPS; Perfectly rendered
ClassiCube 1.3.5/1.3.6: Unknown; renders a white screen
Minecraft: Crashes on game launch. The game may also complain about color depth here.
Tux Racer: Unknown; renders a white screen
Half Life: Unknown; renders a white screen and then crashes on game launch
GL Excess: Unknown; instant crash
Techland S3D
We've now moved on from GLDirect! From here on out, each wrapper is instead a discrete opengl32.dll file that must be dropped into the folder of whatever program you'd like to run it with.
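(The reason dropping a file named opengl32.dll next to the game works at all is Windows' DLL search order: the application's own directory is searched before the system directory. A tiny sketch to confirm which copy actually got loaded:)

```c
#include <windows.h>
#include <stdio.h>

/* Load opengl32.dll exactly the way a game would and print where the  */
/* loader actually found it: a wrapper sitting in the app's directory  */
/* wins over the system-wide opengl32.dll.                             */
int main(void) {
    HMODULE gl = LoadLibraryA("opengl32.dll");
    char path[MAX_PATH];
    if (gl != NULL && GetModuleFileNameA(gl, path, MAX_PATH) > 0)
        printf("opengl32.dll resolved to: %s\n", path);
    else
        printf("failed to load opengl32.dll\n");
    return 0;
}
```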
OpenGL Version String
Vendor: Techland
Renderer: S3 Virge 3093KB texmem KNI
Version: 1.1 beta 6
Right off the bat, things appear to be taking a turn for the interesting as GLGears fails to render anything.
Performance
GLGears: 60 FPS, but only because a black screen is rendered.
ClassiCube 1.3.5/1.3.6: Crashes on game launch but renders a solid blue screen.
Minecraft in-20091223-1459: We load into a world! Have fun trying to play it, though. Rendering is very flickery and broken; there may be some kind of z-buffering issue. 12-15 FPS, if that matters at all in this case.
Minecraft Alpha 1.0.4: Crashes on game launch.
Minecraft Beta 1.7.3: Renders an inverted vignette that slowly grows darker.
Minecraft Beta 1.7.3 w/Optifine: Crashes on world load.
Minecraft Release 1.5.2: Rendered the title screen with many errors for a brief moment before turning to a black screen and crashing.
Tux Racer: Unknown; renders mostly a white screen. The game does respond to user inputs, and the rendered scene changes based on those inputs, but no textures or complex objects are ever rendered. Instead, you only get the white "floor" plane, a solid blue skybox, and translucent boxes where text, objects, and particles should've been.
Half Life: Crash on game launch; absolutely RAVAGES the title screen in ways I thought only the Alliance AT3D could.
GL Excess: Actually loads! But renders only solid black or white screens after the initial loading screen.
Techland D3D
youtube
Two more wrappers left after this one! This wrapper and the Techland S3D wrapper before it were both originally created by Techland for their game "Crime Cities". Being an OpenGL title in an era when support was spotty at best across the dozens of vendors that existed at the time, it necessitated a set of OpenGL wrappers that could translate calls to other APIs.
Anyway, I originally thought that there wasn't going to be much that'd be interesting about this wrapper based on how Minecraft (didn't) work, but I was quickly proven wrong in the things I tested afterwards!
OpenGL Version String
Vendor: Techland
Renderer: Direct3D (display, Primary Display Driver) KNI
Version: 1.1 beta 6
Performance
GLGears: ~290 FPS, but only because a black screen is rendered.
ClassiCube 1.3.5/1.3.6: 2-5 FPS. Runs with z-buffering issues. Faster than software rendering, but not by much (software rendering was roughly 1-3 FPS)
Minecraft: Renders only a black screen and (usually) crashes.
Tux Racer: Unknown; renders a mostly-white screen. However, shading works well enough that you can just barely distinguish what is what.
Half Life: Crash on game loading after the title screen; absolutely RAVAGES the title screen in ways I thought only the Alliance AT3D could.
GL Excess: Actually runs! Performance is alright in some parts but generally remains low. Also, it erroneously uses the exact same texture for every single rendered object.
S3Mesa/AltOGL
youtube
So, it turns out that there may be some level of bugginess involved with S3Mesa and AltOGL on my current testing setup. Whereas both wrappers were (somewhat) stable and able to load into both Minecraft and Half Life with relatively little trouble in previous experiments on the ViRGE, this time around they don't appear to be working as expected. Something may have gotten screwed up thanks to installing and using multiple different versions of GLDirect before S3Mesa and AltOGL.
The best-case scenario would have been to start each wrapper's test run from a fresh install of Windows 98, but I didn't do that here. As a result, the findings for S3Mesa and AltOGL should be taken with a grain of salt.
OpenGL Version String (S3Mesa)
Vendor: Brian Paul
Renderer: Mesa S3GL V0.1
Version: 1.1 Mesa 2.6
OpenGL Version String (AltOGL)
Vendor: Brian Paul
Renderer: altD3D
Version: 1.2 Mesa 3.0
Performance - AltOGL
GLGears: 125 fps
ClassiCube 1.3.5/1.3.6: crash + system lockup
Minecraft in-20091223-1459: crashes on world load (normally it's much more stable, according to previous videos; see here for reference: Indev on S3 ViRGE using AltOGL)
Minecraft: Aside from Beta 1.7.3 with Optifine which crashes on world load, all versions of the game crash instantly.
Tux Racer: Freezes on main menu
Half Life: Instacrash
GL Excess: Instacrash
Performance - S3Mesa
GLGears: i forgor to run it 💀
ClassiCube 1.3.5/1.3.6: No crash, but everything is rendered in solid white rather than with textures. Performance is likely in the single digits.
Minecraft in-20091223-1459: crashes on world load (normally it's much more stable, according to previous videos; see here for reference: Indev on S3 ViRGE using S3Mesa)
Minecraft: Aside from Beta 1.7.3 with Optifine which crashes on world load, all versions of the game crash instantly.
Tux Racer: Renders just about everything except for the game terrain and player character. Performance seems to just barely reach double-digits.
Half Life: Instacrash; textures missing on main menu (like Minecraft Indev, this is an unexpected result, as the game is normally able to play.)
GL Excess: Instacrash
Misc Notes
Running HWiNFO32 crashes the system regardless of whether KernelEX is enabled or not.
GLDirect 5 refuses to do anything other than software rendering due to the lack of higher DirectX support by the Virge.
GLDirect 3 can "run", but crashes the entire system after a few seconds in Classicube using the "game" wrapper.
GLDirect 3, unlike 5, has no license code available for free use.
I didn't have the opportunity to record while GLDirect 3's trial was active, and it has since expired. To avoid having to reinstall everything to get a new trial (I've already tried rolling back the system calendar), I will instead be using GLDirect 2, for which I have in fact found a license code.
GLDirect 2 has a fourth option for OpenGL acceleration beyond the CAD/Game DirectX hardware acceleration and the CPU software acceleration, which is a DX8-based wrapper that appears to have later debuted fully in GLDirect 3.x and up. I think it's safe to assume that the CAD/Game DX6-based wrappers were then deprecated and received no further development after GLDirect 2.x given the pivot to a newer DirectX API.
There are a number of other wrappers available for the S3 ViRGE, and even an OpenGL MCD for Windows 2000. However, I'm sticking with the ones listed here, as they implement enough of the OpenGL spec to be (more) widely compatible with a number of games. The S3Quake wrapper, for example, implements only enough of the spec to run GLQuake, and I have never gotten it to launch any other titles.
I really, sincerely didn't expect MC Beta 1.7.3 to launch at all even with Optifine, given how from prior testing I found that the game would instantly crash on anything higher than Classi--nevermind, it just crashed on world-load. (S3Mesa, AltOGL, TechlandD3D, TechlandS3D, GLDirect 1.x)
Non-Optifine Beta 1.7.3 crashes before even hitting the Mojang splash screen. (S3Mesa, AltOGL, TechlandD3D, GLDirect 1.x)
Non-Optifine Beta 1.7.3 gets into a world! All text is missing though as are the blocks. Performance is expectedly horrendous. (TechlandS3D)
Making batch files to handle version-switching instead of doing it by hand is nice.
Alongside the testing setup apparently no longer being ideal for S3Mesa and AltOGL, comparison against some test runs done on a 1.1GHz Athlon XP system also reveals that the ViRGE performs far better with a faster CPU like that Athlon XP than with the 750MHz Pentium III this series of experiments is built upon. This experiment may eventually be redone using that faster CPU.
Writeup: To boldly stumble… pushing the Alliance AT3D to its limits
Yeah. We're venturing into levels of absolute jank that none have ever gone before.
This writeup is also available on The Retro Web!
Fun fact, this card singlehandedly killed Alliance Semiconductor's graphics division! All three successor cards that were planned for release after the AT3D were promptly scrapped and never heard of again. Alliance themselves would crumble not long after, and at present is a shell of its former self that manufactures only memory chips as opposed to… well, mainly memory chips and a few other things on the side.
Aaaaaaanyhow, let's get on with this with a quick spec-dump.
Alliance AT3D - specs
Year released: 1997
Core: Alliance AT3D, unknown manufacturing node, 61MHz
Driver version: 4.10.01.2072
Interface: PCI
PCI device ID: 1142-643D / 0000-0000 (Rev 02)
Mem clock: 61MHz real/effective
Mem bus/type: 4MB 64-bit EDO, 488MB/s bandwidth
ROPs/TMUs/Vertex Shaders/Pixel Shaders/T&L hardware: 1/1/0/0/No
DirectX support: DirectX 6 (DX 3/5)
OpenGL support:
- Native: no support
- Techland OGL to D3D:
  - 100% OpenGL 1.1 compliant
  - 12% OpenGL 1.2 compliant
  - 0% compliant beyond OpenGL 1.2
  - Vendor string:
Vendor: Techland
Renderer: Direct3D (display, Primary Display Driver) KNI
Version: 1.1 beta 6
- AltOGL OGL to D3D:
  - 100% OpenGL 1.1 compliant
  - 100% OpenGL 1.2 compliant (but I highly doubt it)
  - Vendor string:
Vendor: Brian Paul
Renderer: altD3D
Version: 1.2 Mesa 3.0
As for the rest of the system...
Windows 98 SE w/KernelEX (no updates)
Matsonic MS7102C
Intel Pentium III (Coppermine) @ 750 MHz
256MB PC-133 SDRAM (single stick of Kingston KVR133X64C3/512; can't extract SPD data because the system crashes)
Hitachi 4GB Microdrive
Some random Slot 1 Cooler Master CPU cooler
And with that out of the way, onto the notes!
So. Uh, yeah. The Alliance AT3D, and more specifically this AT3D, is a very... VERY strange card. Despite releasing rather late for a 3D-capable graphics chip in comparison to the competition, the AT3D is very clearly half-baked at best, and a flaming dumpsterfire at worst. I'm not sure if it's the hardware itself or drivers written by the world's worst driver dev team to have ever existed, but there is something very, very wrong with the 3D rendering capabilities that this card has.
As implied by the specs of the card from up above, the AT3D has no native OpenGL support. Or native DirectX 6, for that matter. Windows 98 just happens to like to stamp DX6 support onto cards that don't support anything higher. This card was released targeting Windows 95, and drivers for Windows 98 and up were never made available. Luckily, with how similar the two OSes are at the kernel level, the Win95 drivers are fully compatible with '98. Yes, that is in spite of the atrocious 3D rendering.
So, anyway. OpenGL. That was what this video was intended to focus on, but between catching Covid back in August and only finally recovering enough to begin recording by late September, the video ultimately ended up as this congealed pile of rambling and chaos, with plenty of Windows 98 crashes sprinkled in for flavor.
ClassiCube runs... okay, I guess, if you're willing to look past the amazing rendering quality. AltOGL crashes ClassiCube 1.3.5, and though I've been told that 1.3.6 works with AltOGL, with the AT3D at least it still crashes.
So, instead of AltOGL, I am using Techland's OpenGL to Direct3D wrapper. Though intended to be a "mini-GL" for their Crime Cities game, Techland's wrappers have a decent reputation for speed and compatibility among low-end cards. Though, the AT3D is clearly an outlier on both fronts.
Minecraft itself is unable to launch with either wrapper. With the Techland wrapper, the game complains about not being able to create an OpenGL context, which isn't too surprising given how the Techland wrapper implements only a subset of the OpenGL spec. Slightly more surprising, however, is the fact that AltOGL also fails to allow the game to launch, instead resulting in an instant crash back to desktop. So, while Minecraft proper isn't able to run on the AT3D, I still would say that you could pull off some block game shenanigans with this thing if you're willing to suffer the pain of its rendering hardware screaming in agony.
Other games aside from Minecraft and ClassiCube were tested as well, or at least attempts were made, but much like Minecraft itself, they crashed with varying levels of severity under both wrappers. Aside from ClassiCube, the only things I succeeded in running were the 3DMark99 and 3DMark2000 benchmarks and demos.
But, this is not where this writeup ends. Oh, no. There's still a completely fresh, unopened can of worms sitting right here on the table for all of us to enjoy.
That can of worms? The card itself and its BIOS.
If you take a look at VGA Legacy MKIII's entry on the Alliance AT3D, you'll find that all of the cards shown on the website are made by a company called "Super Grace". All of them are identical.
My card, however, has a little extra something: a populated 10-pin header.
I'm not 100% sure about what its function is, but just from eyeballing where the traces lead to from the pin header suggests that this may be a header for some kind of optional TV-out add-on board. Perhaps one that outputs composite and/or s-video. It certainly fits in line with the strange video BIOS (vBIOS) that this card comes with. (Also, wow! Matching graphics core, PCB, and memory chip branding! And look at that neato peacock logo from an unknown company!)
So then. The BIOS. This card's BIOS is version 4.30.00 Build 29, whereas the version of BIOS on the Super Grace cards is 2.30.00 Build 29. The differences go beyond just a bump from a 2 to a 4, too; the "version 4" BIOS has a neat animated Alliance Semiconductor logo and banner that slides in from the right, whereas the "version 2" BIOS is a static text box with no logo or anything. However, the lack of animation does also allow the system to complete the bootup process much faster.
Below are a pair of videos demonstrating the difference:
youtube
youtube
Beyond the visual differences, the primary functional difference between the v2 and v4 BIOSes is that the v2 BIOS seems to be completely unaware of the card's seeming TV-out capability. The v4 BIOS, however, is practically hypersensitive to it.
I'll give you a rundown of what happened.
I often use a program called "VCS" to let my Datapath VisionRGB capture card pass video through and act more or less like a regular monitor. The problem, however, is that when the capture card is hooked up to the AT3D, the EDID (Extended Display Identification Data) that the capture card sends apparently identifies it (to the AT3D, at least) as... a TV.
Now, normally this shouldn't be much if any issue. Graphics cards after all are meant to be able to handle this sort of connection. Heck, before the capture card, I was using an actual TV as the monitor for my testbench.
But because weird hardware seems to gravitate towards me, this was not the case for the AT3D. Whatever EDID info the capture card seems to send to the AT3D when VCS is used causes the card to trigger its "TV mode", which enables a TV features tab in the card's graphics settings and disables almost all of the regular settings for refresh rate and screen positioning available in the regular settings tab. Attempting to disable the TV features tab results in a system crash.
The solution to this? Going into the VisionRGB configuration, manually wiping the EDID information, and using OBS as the video-out from the capture card instead, after rebooting both the capture system and the testbench. This finally gets the AT3D to recognize the capture card as a regular monitor rather than a TV, which makes the TV features tab go away and unlocks the regular monitor settings.
The v2 BIOS does not do any of this.
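(For the curious: EDID is just a 128-byte identity blob the display hands to the card, and whatever VCS stuffs into it is evidently enough to flip the AT3D into TV mode. A small sketch of how the standard base-block fields are laid out; this is the generic EDID format, not the AT3D's actual detection logic, which I can only guess at:)

```c
#include <stdio.h>

/* Minimal decode of a few EDID base-block fields. */
int looks_like_valid_edid(const unsigned char e[128]) {
    static const unsigned char magic[8] =
        {0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00};
    unsigned char sum = 0;
    for (int i = 0; i < 8; i++)
        if (e[i] != magic[i]) return 0;         /* fixed 8-byte header  */
    for (int i = 0; i < 128; i++) sum += e[i];  /* all bytes sum to 0   */
    return sum == 0;
}

void print_edid_id(const unsigned char e[128]) {
    /* Manufacturer ID: three 5-bit letters packed big-endian into      */
    /* bytes 8-9 (1 = 'A'); product code is little-endian, bytes 10-11. */
    unsigned id = ((unsigned)e[8] << 8) | e[9];
    printf("manufacturer: %c%c%c, product code: %u\n",
           'A' - 1 + ((id >> 10) & 0x1F),
           'A' - 1 + ((id >> 5) & 0x1F),
           'A' - 1 + (id & 0x1F),
           (unsigned)(e[10] | ((unsigned)e[11] << 8)));
}
```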
I really don't have anything else to say about this card. The AT3D alone is already known for being one of, if not THE, worst graphics cards to ever exist, and the extra sprinkle of weird behavior my specific card shows is more or less icing on the terribleness cake. Its inability to do much of anything beyond purely-2D tasks makes it hard to find a use for the card in the first place. Though, I guess Baldur's Gate 1 works pretty well, since it's a sprite-based isometric game that only uses 2D rendering techniques.
Anyway yeah this card is peak jank lol. Have my card's weird BIOS that doesn't exist anywhere else on the internet. The v2 BIOS is available from VGA Legacy MKIII.
Also the AT3D can run Tux Racer if you slap AltOGL on it. It crashes the whole system if you try and use the Techland wrapper, so the behavior is the polar opposite of ClassiCube 1.3.5.
Update: It appears that the YouTube channel RetroTechBytes also has a nearly-identical AT3D, peacock logo and all:
youtube
#youtube#techblog#not radioshack#my posts#writeup#Alliance AT3D#To boldly stumble… pushing the Alliance AT3D to its limits
Please signal boost: One-of-a-kind Broken Prototype S3 Chrome 460 Mega-infopost
(Yes, I know that is a very word salad-y post title.)
This is the S3 Chrome 460. It is currently nonworking. Nobody knows anything about it, and I’m trying to get in contact with any former S3 Graphics engineers that may be able to help.
What is known:
All “@” handles are in reference to users on Twitter.
- As a whole, the Chrome 400 series utilized the Destination architecture, S3's very first unified shader architecture capable of GPGPU tasks, much like Nvidia's CUDA.
- The Chrome 450/460 series utilized the mid-range "Destination 2" (D2) architecture, as opposed to the "Destination 3" used by the low-end 430/440. According to Loeschzwerg, no high-end "Destination 1" chip is known to have been taped out. (https://twitter.com/Loeschzwerg_3DC/status/1683162950951903235)
- The Chrome 450/460 was a 13x13mm (169mm²) die produced on the 90nm node by Fujitsu, and later TSMC. Strangely, the 430/440 series was produced on the much smaller 65nm node.
- The Chrome 450/460 supported DX10, whereas the 430/440 supported DX10.1.
- The PCI device IDs for the Chrome 450/460 series are as follows:
  - 9020, 9022, 9023, 9024, 9025 (http://listing.driveragent.com/c/pci/5333)
  - The vendor ID for all chips is 5333 (S3 Graphics)
  - There is one known beta driver that supports these device IDs (https://www.touslesdrivers.com/index.php?v_page=23&v_code=21175)
- These PCI IDs appear to correspond to different die revisions.
  - Known die revisions present in my collection: 86C920-921 (ID 9020), 86C922 (ID 9022, based on the cards owned by @Loeschzwerg_3DC), and 86C924 (assumed to be device ID 9024)
- There are two "major" die revisions known to exist.
  - Revision 86C920-921 (I will refer to this as the "'920" revision) is most certainly the earliest revision, being physically larger in package size than all other revisions.
    - It also has a different pinout, and was produced by Fujitsu rather than TSMC, giving the die a greyish tint rather than the blue tint of later revision chips.
    - The date code on all 86C920 chips (in my collection) reads "0727", which suggests, though cannot concretely prove, that every '920 chip in existence was produced in a single batch in the 27th week of 2007.
  - Revision 86C922 ('922) changes fabs from Fujitsu to TSMC, leaving the dies with a blueish tint and changing the pinout. This revision serves as the basis for all following revisions known to exist.
    - The date code on all 86C922 chips across my collection and Loeschzwerg's appears to be 0738, implying that this revision was also produced in a single run, much like its predecessor. (https://twitter.com/Loeschzwerg_3DC/status/1616750453500547072)
    - The 86C924 ('924) revision is minor in comparison to the jump from '920 to '922, but interestingly, the two in my collection have mismatching date codes: 0728 and 0734. Unless I am reading the date codes wrong, this means the '924 revision was produced either in two separate batches before the '922 run was done, or perhaps continuously between the '920 and '922 revisions. I do not know what changes were made between '922 and '924.
  - Almost all dies have a number and a letter, a word, or both written on them in permanent marker, likely for IDing which ones were being tested for what. The two words that appear most often are "mem" (by far the most common) and "shift". The letter "p" (pass?) also regularly appears.
- There are three known Chrome 460 ES cards in existence: my broken early-revision '920 card, and Loeschzwerg's two later-revision cards, one of which appears to be a 460 binned down to a 450.
- There were rumors that the card was scrapped seemingly just before release because its die became too expensive to produce (https://forum.beyond3d.com/threads/s3-excalibur-with-128-bit-mc-taped-out.42965/). There is zero confirmation or denial of this, and the rumor was mentioned in passing on a thread discussing the follow-up Chrome 500 series (Excalibur architecture, later revealed to be, at best, a mildly tweaked Destination architecture).
- The Chrome 460 (or at least its '920 revision) gets hot very quickly, which may have been the reason for the switch to TSMC's fabs.
  - I believe that my card, despite having a heatsink and a fan blowing directly at it, likely died from overheating in the few minutes it was running while I was collecting information and attempting to install an INF-modded 430/440 driver. It worked again after replacing the popped MOSFET and putting on the proper stock heatsink it would've had mounted, but quickly died to a short a second time even with the new cooler. I still feel bad about this.
- Most if not all Chrome 460 chips were acquired either directly through the S3/VIA/Centaur lab auction or through someone who attended it. The chips and card I own came from John at CPUShack, for example, and a number of others purchased miscellaneous 460s for their private collections from him.
  - Loeschzwerg got his cards from a reseller who attended the 2021 Centaur auction, just as I had.
And now, returning to my own broken card, to reiterate:
- The card died to a short, and according to @dosdude1, it triggers the power supply's OCP mode when the chip is installed on the board. This does not happen when the chip is removed.
- VCORE and GND were found by dosdude1 on the card's footprint for the Chrome 460, and using that knowledge, the resistance from VCORE to GND on the chip itself was measured:
  - The original chip soldered to the card measured 1.5 ohms from VCORE to GND.
  - All other '920 chips, between the ones in my collection and those sent to dosdude1 as candidates for chip-swapping, measure between 0.5 and 0.9 ohms, which I was told should be normal for graphics chips, but which according to dosdude1 is still plenty low enough to trip the PSU's OCP when installed.
Timeline
- I originally purchased the card, alongside two "spare" '920 chips, two '922 chips, and two '924 chips, from CPUShack. The card had no heatsink included, so I had to make do with a chipset cooler (which honestly might have been better at cooling than the stock cooler) and a case fan.
  - A few weeks after the purchase and arrival of the card, which by then had blown out its MOSFET and died short, I was able to buy a couple of S3 Chrome 400-series coolers from CPUShack after he found them. Installing the cooler and replacing the MOSFET with an identical one salvaged off a Chrome 430 got the card working again for about half a minute, from what I remember. I will also mention that the salvaged MOSFET came off a Chrome 430 that had also died short; the MOSFET itself was undamaged, but that die appeared to be busted too. I guess it was just an S3 thing during this generation.
  - From what I remember, I filled out repair forms for both Louis Rossmann and NorthridgeFix, in hopes that a chip swap with one of the extra '920 dies I'd acquired at the time could bring the card back to life. In both cases I was turned down, though this was to be expected, since there is zero documentation available for either the card or the chip itself.
  - I was then able to get in contact with dosdude1, asking for the same thing; just a chip swap was what I hoped needed to be done. After some back and forth, I sent off the card, both of the extra '920 chips I had at the time, and some additional VRM MOSFETs that I'd bought after running out of salvage chips.
  - Shortly after the card and chips were mailed out, I came back to CPUShack asking if he had any more Chrome 460 dies in stock. I proceeded to purchase all remaining chips he could find, which consisted of 10 '920 and 10 '922 revision dies.
  - The chip swap was completed with minimal issue, but the card unfortunately remained dead. It's at this point that dosdude1 decided to probe for VCORE and GND, and in turn I measured the resistance between the two on all ten of the '920 chips that had arrived by then, finding that they all read between 0.5 and 0.9 ohms. I do not know the VCORE-to-GND resistance on the later die revisions, as the pinout was changed.
  - Currently, dosdude1 is waiting for a replacement buck regulator to arrive after replacing the MOSFETs themselves yet again, having found in the process that the replacement MOSFETs I sent were faulty and... sending 12 volts straight into VCORE. That's on me for buying them off eBay. I did not know where else to look.
  - Despite this, even after the faulty MOSFETs were replaced with tested-working ones, the card continued to pull far too much current.
  - Finally, as of now, he is still waiting for the buck regulator to arrive. There is no evidence that it is faulty as opposed to the MOSFETs, but neither of us has any clue what could be wrong anymore, aside from the dies themselves. It's very much possible that one of the three dies in his possession was fried to a crisp by that momentary 12V to VCORE in the split second before OCP kicked in, but even after a non-faulty replacement was installed alongside a spare die, the card continues to trip the power supply's OCP.
  - I own 27 Chrome 460 dies across three revisions, 13 of which, counting the two sent to dosdude1 and the one installed on the card itself, are of the '920 revision that's been causing dosdude1 and me headaches for the better part of a year. Neither of us knows what is wrong with the card, and outside of the discovery of the faulty "new" MOSFETs I'd sent, the only suspects left are the dies themselves. They can't all be bad, right?
Misc. info
My only lead on anyone that may be familiar with the S3 Chrome 460 is Bruce Chang of VIA Technologies. However, his email appears to only accept messages from others in the company.
Additional Photos
#not radioshack#techblog#youtube#my posts#S3 Chrome#S3 Chrome 460#prototype#prototype hardware#technology#prototype technology#graphics card#unreleased technology#tech blog#call for help#please signal boost
Fixing an AT3D. 👍
Writeup: Forcing Minecraft to play on a Trident Blade 3D.
The first official companion writeup to a video I've put out!
youtube
So. Uh, yeah. Trident Blade 3D. If you've seen the video already, it's... not good. Especially in OpenGL.
Let's kick things off with a quick rundown of the specs of the card, according to AIDA64:
Trident Blade 3D - specs
Year released: 1999
Core: 3Dimage 9880, 0.25um (250nm) manufacturing node, 110MHz
Driver version: 4.12.01.2229
Interface: AGP 2x @ 1x speed (wouldn't go above 1x despite driver and BIOS support)
PCI device ID: 1023-9880 / 1023-9880 (Rev 3A)
Mem clock: 110MHz real/effective
Mem bus/type: 8MB 64-bit SDRAM, 880MB/s bandwidth
ROPs/TMUs/Vertex Shaders/Pixel Shaders/T&L hardware: 1/1/0/0/No
DirectX support: DirectX 6
OpenGL support:
- 100% (native) OpenGL 1.1 compliant
- 25% (native) OpenGL 1.2 compliant
- 0% compliant beyond OpenGL 1.2
- Vendor string:
Vendor: Trident
Renderer: Blade 3D
Version: 1.1.0
And as for the rest of the system:
Windows 98 SE w/KernelEX 2019 updates installed
ECS K7VTA3 3.x
AMD Athlon XP 1900+ @ 1466MHz
512MB DDR PC3200 (single stick of OCZ OCZ400512P3) 3.0-4-4-8 (CL-RCD-RP-RAS)
Hitachi Travelstar DK23AA-51 4200RPM 5GB HDD
IDK what that CPU cooler is but it does the job pretty well
And now, with specs done and out of the way, my notes!
As mentioned earlier, the Trident Blade 3D is mind-numbingly slow when it comes to OpenGL. As in, to the point where at least natively during actual gameplay (Minecraft, because I can), it is absolutely beaten to a pulp using AltOGL, an OpenGL-to-Direct3D6 "wrapper" that translates OpenGL API calls to DirectX ones.
Normally, it can be expected that performance using the wrapper is about equal to native OpenGL, give or take some fps depending on driver optimization, but this card?
The Blade 3D might honestly have been better off going the S3 ViRGE route and shipping no OpenGL ICD in any driver release, period.
For the purposes of this writeup, I will stick to a very specific version of Minecraft: in-20091223-1459, the very first version of what would soon become Minecraft's "Indev" phase. This version notably lacks any survival features and, aside from the MD3 models present, is indistinguishable from previous versions of Classic. All settings are at their absolute minimum, and the window size is left at default, with a desktop resolution of 1024x768 and 16-bit color depth.
(Also the 1.5-era launcher I use is incapable of launching anything older than this version anyway)
Though known to be unstable (as seen in the full video), gameplay in Minecraft Classic using AltOGL reaches a steady 15 FPS, nearly triple that of the native OpenGL ICD that ships with Trident's drivers for the card. AltOGL is also known to often have issues with fog rendering on older cards, and the Blade 3D is no exception... though I believe it may be far preferable to have no working fog than... well, whatever the heck the Blade 3D is trying to do with its native ICD.
See for yourself: (don't mind the weirdness at the very beginning. OBS had a couple of hiccups)
youtube
youtube
Later versions of Minecraft were also tested, where I found that the Trident Blade 3D follows the same, as I call them, "version boundaries" as the SiS 315(E) and the ATi Rage 128, both of which are cards that easily run circles around the Blade 3D.
Version ranges mentioned are inclusive of their endpoints.
Infdev 1.136 (inf-20100627) through Beta b1.5_01 exhibit world-load crashes on both the SiS 315(E) and Trident Blade 3D.
Alpha a1.0.4 through Beta b1.3_01/PC-Gamer demo crash on the title screen due to the animated "falling blocks"-style Minecraft logo on both the ATi Rage 128 and Trident Blade 3D.
All the bugginess of two much better cards, and none of the performance that came with those bugs.
Interestingly, versions even up to and including Minecraft release 1.5.2 are able to launch to the main menu, though by then the already-terrible lag present in all prior versions of the game on the Blade 3D makes it practically impossible to even press the buttons needed to load into a world. Though this card is running in AGP 1x mode, I sincerely doubt that running it at its supposedly-supported 2x mode would bring much, if any, meaningful performance increase.
Lastly, ClassiCube. ClassiCube is a completely open-source reimplementation of Minecraft Classic in C, which allows it to bypass the overhead normally associated with Java's VM platform. However, this does not grant it any escape from the black hole of performance that is the Trident Blade 3D's OpenGL ICD. Not only this, but oddly, the red and blue color channels appear to be switched by the Blade 3D, resulting in a very strange looking game that chugs along at single-digits. As for the game's DirectX-compatible version, the requirement of DirectX 9 support locks out any chance for the Blade 3D to run ClassiCube with any semblance of performance. Also AltOGL is known to crash ClassiCube so hard that a power cycle is required.
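(That red/blue swap is the classic symptom of an RGB-vs-BGR component-order mixup somewhere in the texture or framebuffer path. In code terms the difference is a three-byte shuffle per pixel, which is what makes it such an easy driver bug to ship. A toy illustration, not the ICD's actual code:)

```c
#include <stddef.h>

/* Swap the R and B components of a tightly packed 24-bit RGB buffer.  */
/* If a driver stores pixels as BGR but the app hands it RGB (or vice  */
/* versa) without converting, everything looks exactly like this bug.  */
void swap_red_blue(unsigned char *pixels, size_t pixel_count) {
    for (size_t i = 0; i < pixel_count; i++) {
        unsigned char tmp = pixels[i * 3 + 0];   /* save R       */
        pixels[i * 3 + 0] = pixels[i * 3 + 2];   /* B -> R slot  */
        pixels[i * 3 + 2] = tmp;                 /* R -> B slot  */
    }
}
```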
Interestingly, a solid half of the accelerated pixel formats supported by the Blade 3D, according to the utility GLInfo, are "render to bitmap" modes, which I'm told is a "render to texture" feature that normally isn't seen on cards as old as the Blade 3D. Or in fact, at least in my experience, any cards outside of the Blade 3D. I've searched through my saved GLInfo reports across many different cards, only to find each one supporting the usual "render to window" pixel format.
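(GLInfo is essentially walking the pixel format list the driver exposes; "render to bitmap" is just a flag on each format. For anyone wanting to reproduce the report, a small sketch of the same enumeration using the standard Win32 DescribePixelFormat call:)

```c
#include <windows.h>
#include <stdio.h>

/* Enumerate the pixel formats the display driver exposes and show the */
/* draw-target flags GLInfo reports: "window" vs "bitmap". The         */
/* software/accelerated split below is simplified: PFD_GENERIC_FORMAT  */
/* alone means Microsoft's software renderer.                          */
int main(void) {
    HDC dc = GetDC(NULL);
    int count = DescribePixelFormat(dc, 1,
                                    sizeof(PIXELFORMATDESCRIPTOR), NULL);
    for (int i = 1; i <= count; i++) {
        PIXELFORMATDESCRIPTOR pfd;
        DescribePixelFormat(dc, i, sizeof(pfd), &pfd);
        printf("format %2d: %s%s%s\n", i,
               (pfd.dwFlags & PFD_DRAW_TO_WINDOW) ? "window " : "",
               (pfd.dwFlags & PFD_DRAW_TO_BITMAP) ? "bitmap " : "",
               (pfd.dwFlags & PFD_GENERIC_FORMAT) ? "(software)"
                                                  : "(accelerated)");
    }
    ReleaseDC(NULL, dc);
    return 0;
}
```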
And with that, for now, this is the end of the very first post-video writeup on this blog. Thank you for reading if you've made it this far.
I leave you with this delightfully-crunchy clip of the card's native OpenGL ICD running in 256-color mode, which fixes the rendering problems but... uh, yeah. It's a supported accelerated pixel format, but "accelerated" is a stretch like none other. 32-bit color is supported as well, but it performs about identically to the 8-bit color mode; that is, even worse than 16-bit color.
At least it fixes the rendering issues I guess.
youtube
youtube
#youtube#techblog#not radioshack#my posts#writeup#Forcing Minecraft to play on a Trident Blade 3D#Trident Blade 3D#Trident Blade 3D 9880
I really don't understand why people reblog my pinned post. It's just supposed to be a decoration for my blog theming, not really good content I felt like people would wanna share. Like, more power to you, and thank you for the support... but like..... why??
Might start actually using this blog for writeups on hardware experiments and stuff btw
If you are or know a former S3 Graphics engineer that has worked on this card, feel free to contact me! (I'm both @not-radioshack and @themajortechie).
Otherwise, share this around if you can!
A call to help on repairing a one-of-a-kind prototype!
youtube
Panasonic KX-T1520 Easa-Phone Answering Machine (Japan, 1970s)
REALLY liking how these two photos turned out. I'll probably be sending a couple of these off to dosdude1 to (hopefully) repair my S3 Chrome 460 prototype soon, but the rest of these are either reading with extremely low resistances (a possible indication of the chip being faulty) or are a newer revision that is physically smaller and has a different pinout.
Guess who's hosting #GPUJune3 this year?
@themajortechie yeet
It's an honor for PixelPipes to allow me to host this event when just a year ago I was only a participant.
youtube
Rules of participation
As far as graphics cards go, the stranger the better! Modern graphics cards are allowed as well as iGPUs. Please keep in mind however that the point of this event is to shine light on older or more uncommon graphics processors. :)
Like previous GPUJune events, videos must be tagged with #GPUJune3 somewhere in the video title and/or description so that it can be more easily found. A (very simple) Google Forms link is provided below to sign up for the event and have your videos featured in the official playlist.
Submit your YT channel through the Google Forms link below, and then check out some neat Discord servers! https://forms.gle/QExJPTT4HU649GYo9
GPUJune logos for use in thumbnails: https://drive.google.com/drive/folders/10azryRnymqtwehVPdLRHOBnIgd6RHB1d?usp=sharing
Unlike previous GPUJune events, there is no associated website this year. The official playlist is linked below: https://youtube.com/playlist?list=PLnHilFIYvt5Ow_9WhHb8RMYtPIPsExcnx
#GPUJune#GPUJune3#GPUJune 3#technology#PixelPipes#TheMajorTechie#youtube#youtube event#techblog#techblr#graphics card#GPU#Youtube
The S3 ViRGE Minecraft Thing
(Article originally posted on TheRetroWeb)
Have you ever wanted to play Minecraft? Have you ever wondered “how terrible can I make this experience”? No? Well too bad. You’ve clicked on this article and I’ve gone this far already, so let’s just keep going and see what happens.
A little bit of history…
The S3 ViRGE, short for Video and Rendering Graphics Engine (alternately, Virtual Reality Graphics Engine), was first introduced in November of 1995, with an actual release date of early 1996. It was S3 Graphics’ very first 3D-capable graphics card, and it had the unfortunate luck of launching alongside… the 3Dfx Voodoo 1.
It became rather quickly apparent that the ViRGE was terribly insufficient in comparison to the Voodoo, and in fact even picked up the infamous moniker of “graphics decelerator”, which poked fun at its lackluster 3D performance.
The original ViRGE would be followed by the card that this article focuses on, the ViRGE/DX, just a little under a year later in the waning months of 1996.
The ViRGE/DX was a welcome improvement over the original release, lifting performance to more acceptable levels and improving software compatibility with better drivers. Mostly. And while native Direct3D performance was iffy at best and OpenGL support was nonexistent, S3 did have one last trick up their sleeves to keep the ViRGE line relevant: the S3D Toolkit.
Similar to 3Dfx’s Glide API, the S3D Toolkit was S3 Graphics’ proprietary low-level graphics API for the ViRGE. Unlike 3Dfx’s offering, however, S3D, much like the cards it was intended for, fell flat on its face. Only a small handful of games ever natively supported S3D acceleration, and by my own admission, I haven’t ever played any of them.
But wait, this article is about playing Minecraft on the ViRGE, isn’t it? The block game of all time is famously written in Java, and uses an OpenGL rendering pipeline. So, how can the S3 ViRGE, a card with no OpenGL support, possibly play Minecraft?
Wrappers!
This is where a little thing called “OpenGL wrappers” come in. Shipping in the form of plain OpenGL32.dll files (at least, on Windows) that you drop into a folder alongside whatever needs OpenGL acceleration, these wrappers provide a way to modify, or “wrap”, OpenGL API calls.
In our case, we are interested in the category of OpenGL wrappers that translate OpenGL API calls to that of other APIs. For a more modern equivalent of these wrappers, the Intel Arc line of graphics cards uses DXVK in order to translate older DirectX 9 calls to Vulkan, which is a natively-supported API.
For this experiment, we will be using a wrapper called “S3Mesa”, made by Brian Paul of the Mesa project. Though open-source, this wrapper never made it to a completed state, and is missing certain features such as texture transparency despite the ViRGE itself being supposedly capable of it. However, this does not affect gameplay much beyond aesthetics.
The S3Mesa wrapper, on a more technical note, translates OpenGL 1.1 calls to a mix of both S3D and DirectX API calls.
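(Conceptually, a wrapper like this is just a DLL that exports the standard OpenGL 1.1 entry points and re-expresses each call in the target API. A heavily simplified sketch of that shape below; backend_clear stands in for whatever S3D/Direct3D work the real S3Mesa does and is purely hypothetical:)

```c
#include <stdio.h>

/* Sketch of the shape of an OpenGL-to-native wrapper "opengl32.dll".  */
/* A real wrapper (S3Mesa, AltOGL, Techland's) exports hundreds of gl* */
/* functions; each one translates GL state and calls into native work. */
typedef unsigned int GLbitfield;
typedef float        GLfloat;
#define GL_COLOR_BUFFER_BIT 0x00004000

/* Stand-in backend: a real wrapper would issue S3D or Direct3D        */
/* commands here. Printing keeps this sketch self-contained.           */
static void backend_clear(GLfloat r, GLfloat g, GLfloat b, GLfloat a) {
    printf("backend: clear to (%.2f, %.2f, %.2f, %.2f)\n", r, g, b, a);
}

static GLfloat clear_col[4];  /* GL state the wrapper tracks itself */

__declspec(dllexport) void __stdcall glClearColor(GLfloat r, GLfloat g,
                                                  GLfloat b, GLfloat a) {
    clear_col[0] = r; clear_col[1] = g;
    clear_col[2] = b; clear_col[3] = a;
}

__declspec(dllexport) void __stdcall glClear(GLbitfield mask) {
    if (mask & GL_COLOR_BUFFER_BIT)
        backend_clear(clear_col[0], clear_col[1],
                      clear_col[2], clear_col[3]);
}
```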
The System
At last, we arrive at the system hardware. As of writing, I am currently benchmarking a plethora of low-end (or otherwise infamous) cards for my “Ultra Nugget Graphics Card Roundup”, and so the system itself is likely a liiiiiittle bit overpowered for the lowly ViRGE/DX:
AMD Athlon XP (Palomino) @ 1.14GHz
Shuttle MK32 Socket A motherboard
256MB DDR-400
S3 ViRGE/DX (upgraded to 4MB of video memory)
Windows 98SE
Why Windows 98SE? Because S3 never released 3D-accelerated graphics drivers for non-Windows 9x operating systems in the consumer space.
For Minecraft itself, KernelEX 4.5.2 and Java 6 are installed as well, and an old version of the launcher dating back to early 2013 that I personally refer to as the “Minecraft 1.5 Launcher” is used for compatibility purposes. Also because no launcher that can work on Windows 98 is capable of logging into the authentication servers anymore.
Setting up the game
With Windows 98SE, KernelEX, and Java 6 installed (in that order, of course), we can turn our attention to the game itself. As mentioned before, no launcher to my knowledge that runs on Windows 98 is capable of logging into the auth servers. This results in two additional problems: starting the game itself and downloading game assets.
Using the 1.5 launcher solves the first issue by means of a little thing called the lastlogin file. This is how the old launcher let players keep playing offline when disconnected from the internet, but more importantly, unlike the modern launcher's equivalent, it doesn't expire. 🙂
And because of that, our login problem is solved by middle school me’s old .minecraft folder backup, from which I’ve extracted the lastlogin file for use in this experiment.
As for game assets, there is no longer any easy way to download the game files for use on Windows 98SE directly, so I've instead pieced together a folder using that same backup. The most important difference is that instead of a "versions" folder, there is a "bin" folder, where both the natives and the game's jarfile reside.
Now that our .minecraft folder is acquired, take that thing and plop it right down into the Windows folder in Windows 98. Why? Because on Windows 98, the 1.5 launcher ignores the "application data" folder entirely. The launcher itself can go anywhere you'd like, so long as you're using the .exe version and not the .jar version.
Finally, to wrap things up, place the OpenGL to S3D wrapper in the same location as the launcher exe. Make sure it’s called OpenGL32.dll!
The Game
You just lost it. 🙂
The S3 ViRGE, by my own testing, is capable of running any version of Minecraft from Classic up to and including Indev version in-20100110. However, it is EXTREMELY unstable, with a tendency to crash mere seconds after loading into a world. This is on top of some minor rendering errors introduced by the aforementioned incomplete state of the S3Mesa wrapper. This video was recorded on Windows ME rather than Windows 98, but that doesn't impact performance or compatibility (in fact, from my own experience, the game is more stable under ME than 98).
Below are the desktop/game settings used in testing:
“Tiny” render distance
Desktop resolution: 640 x 480 (don’t fullscreen the game)
Bit depth: 16/24-bit color (32-bit can cause the ViRGE to run out of memory, and 16-bit can cause strange issues on Windows 98)
And last but not least, some gameplay. This came from some scrapped footage originally intended for my “UNGCR” video, and was only intended for personal reference in collecting performance numbers. As such, the audio is muted due to some copyrighted music blasting in the background.
youtube
Further reading/resources
Vogons Wrapper Project
Original video that this article is based on
VGA Legacy MKIII’s ViRGE/DX page
thanks for reading my walking natural disaster of an article kthxbaiiiiiiii