How I resurrected 25-year-old spectrometers

November 24, 2025

> GitHub repo

> Want access to our solar spectral data? Email us at brian.carlsen@ipv.uni-stuttgart.de or sekretariat@ipv.uni-stuttgart.de

> Check out the data pipeline that uses this on GitHub.

When I started this project I thought it would be a pretty mundane few hours of work, so I didn’t think to keep a record of anything. In hindsight, both of those assumptions were mistakes, and I’ve had to reconstruct this story as best as possible from memory. Because of that, some details are likely inaccurate and some emotions embellished.

During my PhD, one of my first projects was to run the data analysis of simulated real-world degradation experiments. It’s a favorite project of mine, and it kicked off an interest I’m still pursuing: understanding the contributions of environmental factors to perovskite degradation. How much do humidity, temperature, light cycling, and the rest of the environmental factors a perovskite solar cell is exposed to affect its degradation? During my PhD defense, Prof. Dil asked why I hadn’t used a Bayesian network to classify these contributions in the data I had. Under the pressure of the exam, I didn’t have a good answer, but the comment stuck with me.

Within my first week of arriving at ipv I received three tours of the facilities, but the last one, with Prof. Saliba himself, who was appointed ipv Director in 2020, included a little extra bit – the rooftop. I arrived in Stuttgart in January, but luckily on this day the weather was decent, so he took me up there to show me the solar park… and the view of the city. There are four or five racks of solar modules that have been continuously collecting data for two decades, all available for us to use in research – the only such testing facility at the University of Stuttgart. A perfect opportunity for me to continue my degradation investigation.

Prof. Saliba told me to get in touch with Navid to access the data. The next day Navid took me back up on the roof to explain the technical details of the solar park. He showed me the different types of modules that were installed, the data collection boxes he designed, and the rest of the infrastructure, which looked like it hadn’t been updated since the park was first built in the ’90s. This was when the ipv was still called the ipe and focused on CIGS, silicon, and other generation I and II solar cells. At the end of the tour Navid opened up the rooftop office. It was full of old equipment and tangled wires. Navid explained what each piece of equipment did, at least the ones he knew about, and then pointed to two spectrometers: “It would be great to get these running again, but the computer is too full and it’s the only one they work with.” That should be an easy fix, I thought, so I told him I’d give it a go. We traced through the nest of wires, eventually disconnecting the spectrometers and the ’90s-style computer setup – including a Microsoft IntelliMouse – from the rest of the equipment, and hauled it all down to my office.

This will be quick… right?

The ultimate goal is to measure the solar spectrum in various weather conditions and correlate that to both the performance and degradation of our solar modules, using the data as input into AI prediction models. To make this happen, I had three goals for the spectrometers:

  1. Get them working
  2. Control them with a Python script
  3. Stream their data into our database

Cleaning the computer

Navid had mentioned the computer running the spectrometers didn’t have enough disk space to capture any more spectra, so my first step was to archive all the data currently on the computer (we’re scientists, so we never delete data), then remove it all to make way for new measurements. I started up the computer and it flashed the good old Windows XP logo. Not technically a ’90s computer (Windows XP was released in 2001), but close. I wanted to store the data on the group’s shared drive, so I plugged in an ethernet cable and tried to drag and drop the computer’s data folder onto it. And… we hit our first bump in the road.

The shared drive is 8 TB, 7.5 of which were full, and the old data was about 50 GB. While there was enough space, I didn’t want this old data eating 10% of what little free space we had left. I browsed through the data a bit; all the files had the same shape, just lists of numbers, so I should be able to get a pretty good compression ratio and save a massive amount of storage. I right-clicked on the data folder and looked for the Compress to… menu item. Not there. Right… Windows XP, so we don’t have the luxury features we’re used to, like built-in zip compression. I checked which compression tools were available for Windows XP, and good ol’ reliable 7-Zip is compatible. I threw it on a USB stick, moved it to the computer, and gave it a go. 7-Zip started brr-ing away. I let it run for the rest of the day, planning to check it in the morning.

Morning arrived, the birds were singing (not really, it was winter here in Germany), and I was ready to format the drive and get the spectrometers running. When I opened the computer to check, I was greeted by a nice red error message. 7-Zip could not complete the archive because it contained too much data: files on FAT32 systems have a 4 GB size limit, and we were compressing 50 GB. Okay, not a big issue – the main data folder has three subfolders, so we could just zip each of them individually. Nope, that won’t work either: each of them is too big, and one more level down are hundreds of folders, which I wasn’t going to zip manually.

This FAT32 file size limit only exists on the XP machine, though, not on the shared drive. What if I compressed the files directly onto the shared drive? The 7-Zip GUI, however, offers no way to specify the output directory. Something told me there must be a way to do this, which is how I discovered the 7-Zip CLI. It provides a -o flag that specifies the output directory. After some small tests I set it to task and let it run overnight.
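
For illustration, the overnight job could be scripted roughly like this – the paths and folder layout are hypothetical, and instead of the -o flag this sketch simply passes an archive path on the share to 7-Zip’s `a` command, which achieves the same thing:

```python
import subprocess
from pathlib import Path

# Hypothetical locations -- the real folder names are long gone from memory.
DATA_DIR = Path(r"C:\SpectraData")
SHARE_DIR = Path(r"Z:\spectra-archive")

def build_7z_command(subfolder: Path, dest_dir: Path) -> list[str]:
    """Build a 7-Zip 'add' command whose archive lands directly on the
    destination drive, so the local FAT32 4 GB file limit never applies."""
    archive = dest_dir / f"{subfolder.name}.7z"
    return ["7z", "a", str(archive), str(subfolder)]

def archive_all(data_dir: Path = DATA_DIR, dest_dir: Path = SHARE_DIR) -> None:
    """Archive each top-level subfolder as its own .7z on the share."""
    for sub in sorted(p for p in data_dir.iterdir() if p.is_dir()):
        subprocess.run(build_7z_command(sub, dest_dir), check=True)
```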

When I returned in the morning I checked the shared drive, and all the data was there! Now I could just delete the data and test the spectrometers. I selected the data folder, hit delete, and up popped an error: the folder required admin rights to be removed. I asked Navid if he knew the admin password. He didn’t, but he knew the original maintainer of the solar park and said he would reach out to see if he remembered it. After a few days the old maintainer replied, but he didn’t remember the admin password either. I made a few half-hearted guesses to see if I could crack it myself, but nothing worked, and I didn’t feel that brute-forcing the password was worth the effort.

This wasn’t a huge deal, though: I could just format the disk to start from a blank slate and reinstall Windows XP. I did a search on how to install Windows XP and was again reminded that this computer is from the ’00s. Windows XP requires a CD to reinstall. (For those of you who don’t know, a CD is like a shiny frisbee us old-timers used to store things on. Things like music and operating systems.) It wouldn’t even be worth trying to find the original CD that came with the computer, but I could always download an ISO. I would still need the product key, though. (For those of you who don’t know, a product key is like a promo code attached to the CD case that unlocks the software on it.) After some more searching I learned that the product key is stored in the registry, so I could grab it before formatting the disk.

I followed an online tutorial (which I can no longer find the link to) on how to extract the product key and, after some playing around, extracted something that looked like a Windows product key. After writing it down in multiple places out of paranoia, I was ready to format the disk. Without admin rights, though, I couldn’t format the disk in the original computer. I would need to plug it into a computer I did have admin rights on.
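
From what I can reconstruct, the tutorial read the DigitalProductId value from the registry (under HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion) and decoded it with the well-known base-24 scheme. A sketch of that decoding step – the byte offsets and registry path are from memory, so treat them as assumptions:

```python
# Character set used by XP-era product keys (no vowels, no 0/1/5).
DIGITS = "BCDFGHJKMNPQRTVWXY2346789"

def decode_product_key(digital_product_id: bytes) -> str:
    """Decode a 25-character product key from a DigitalProductId blob.
    Bytes 52-66 hold the key as a little-endian base-256 number, which
    is repeatedly divided by 24 to produce the key characters."""
    key_bytes = bytearray(digital_product_id[52:67])
    chars = []
    for _ in range(25):
        value = 0
        for j in range(14, -1, -1):  # long division over the 15 bytes
            value = (value << 8) | key_bytes[j]
            key_bytes[j] = value // 24
            value %= 24
        chars.append(DIGITS[value])
    key = "".join(reversed(chars))
    # Group into the familiar XXXXX-XXXXX-XXXXX-XXXXX-XXXXX form.
    return "-".join(key[i:i + 5] for i in range(0, 25, 5))
```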

When I opened up the chassis I was greeted by 25 years of dust. That little pinch of OCD in me begged to clean all the internals, so I gave in. I wiped down the cables, fans, power supplies, and boards, blasting them with some compressed air to finish off the job. With the computer physically cleaned, it was time to digitally clean it. I didn’t want to run anything on my main computer in case I blew something up, so I grabbed a spare, took the HDD out of the original computer, and plugged it in. I opened up the Format drive menu, clicked Format, and we were back to a blank slate. All I had to do was load Windows XP back onto the disk and we were golden.

XP installs in two steps: first you configure it, then the actual install runs. I plugged the drive back into the original computer, loaded the ISO onto a USB, booted from it, and ran through the configuration options. After a restart, the install process began, and I gave it a few hours to complete. When I checked back, though, the process was hanging, saying the disk couldn’t be found. I tried again with some different configuration options, but still no luck. I plugged the disk back into the spare computer, reformatted it again, and tried to install. It still wasn’t working. At this point I started playing musical chairs, popping the disk in and out of multiple computers, reformatting, trying to reinstall, but no luck. Finally, I plugged the disk back into the original computer and booted into the BIOS to see if anything looked odd. The disk was set up in a RAID master/slave configuration. Maybe that was what was causing the issue. I changed it to a single-disk configuration and tried the reinstall process once more. That didn’t work either, so I set it back to the original settings and tried again. This time, though, the original computer didn’t even find the disk. I had fried it.

So after all that, it appeared that getting the old computer running wasn’t going to work at all. Step 1: Failed.

I could try a new strategy though. I searched for another machine running 32-bit Windows. While we didn’t have any 32-bit machines, we happened to have a single machine with 64-bit Windows 7 installed. I wanted to test if the spectrometers would even run on this computer, so I reached out to the StellarNet team.

DLL and seek

The StellarNet team responded the next day, letting me know that the spectrometers we have – the EPP2000 UV-Vis and EPP2000 NIR-InGaAs – were discontinued and that I should consider getting new ones. I agreed with them, but the project these spectrometers are meant for doesn’t have the budget for that. I wanted to at least exhaust all my options before having to shuffle money around to purchase new ones.

I was routed to David who sent me the original SpectraWiz software that is meant to interact with the spectrometers. Installing the software took a bit of effort, but after a few minutes it was up and running and the spectra were live updating! Now that I knew the spectrometers were working with the new computer, I wanted to get programmatic control.

Along with the SpectraWiz software, David sent some C++ demo code for controlling the spectrometers. He only sent the source code, though, so I needed to compile it myself. When I first started programming, around the time the spectrometers were built, I used C++ on Linux. It had been a while since I’d had to deal with C++, though, so I knew I was going to be rusty.

I checked whether I could get gcc for Windows, and with MinGW it’s easy. Sometimes the luxuries of modernity are quite nice. I tried compiling the example project with a basic gcc command, but nothing worked, so I started fiddling with flags. I could get the program to compile but not to run. After some debugging I figured out that the StellarNet DLL used to communicate with the spectrometers wasn’t loading. I messaged David to see if he had any insight, and he informed me that I had to compile the programs in 32-bit compatibility mode. So I slapped the -m32 flag into my gcc command, but it still wasn’t working. I thought there might be something wrong with gcc, so I gave it a shot with clang, but still couldn’t get it to work. Maybe I needed a native Windows compiler? The main way to get one is Visual Studio (not VS Code, but the original VS). I downloaded the VS 2022 Community Edition and, after waiting through the 45-minute installation, was greeted by an error that it isn’t compatible with Windows 7. I had the option of using MSVC, the compiler built into VS, on its own, but felt that would be a long, winding road not worth going down.

David had told me that I wouldn’t be able to run the programs on a new (meaning newer than Windows 7) machine, but at this point I felt like I had run out of options, so I might as well try. The SpectraWiz software definitely wouldn’t run on my Windows 10 machine, but perhaps I could get the demo code David had sent me compiled and running.

The demo was a GUI application, but all I wanted to do was get the StellarNet DLL loaded. It was a small app, so after browsing the source code for a few minutes I found the section used to load it. I copy-pasted it into a new app to isolate it. It compiled, but nothing happened when I ran it. I started with some quick-and-dirty print debugging, but my machine’s system language is German, so I couldn’t understand any of the OS errors. I sat there with Google Translate open on my phone, taking a picture of each error that popped up in order to move forward. Eventually – taking about ten times longer than expected – I figured out that the DLL wasn’t loading properly, the same error as before. I thought there might be something wrong with the linking stage during compilation, so I decided to copy all the compilation settings from the demo project. This is how I found out Visual Studio only allows you to have one project open at a time. To copy the settings I had to open the demo project, copy a few lines, then transfer them over to the new project. I flipped back and forth between the two until all of the roughly 100 settings matched, even though I only needed to change about five of them. I ran it again, but still no luck.

At this point I was convinced there was something wrong with how the StellarNet DLL was being loaded, even though I had seen it work while running the demo app. I read more about the LoadLibrary function that loads DLLs and found that I could also use LoadLibraryEx, which accepts flags. The first flag on the list is DONT_RESOLVE_DLL_REFERENCES, which skips loading any DLLs that the target DLL itself imports. The docs say explicitly not to use this flag, as it is only provided for backwards compatibility, but I’m dealing with legacy code, so I thought “maybe that’s a good thing”. It worked! The StellarNet DLL loaded. I excitedly tried two other flags – LOAD_LIBRARY_AS_DATAFILE and LOAD_LIBRARY_AS_DATAFILE_EXCLUSIVE – which treat the DLL as a data file, but both failed with a “proc address” error. So it really did seem the StellarNet DLL was trying to load another DLL and failing to find it.
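
For the curious, the same call can be sketched from Python via ctypes (Windows only; the flag values come from winbase.h, and the DLL path here is just a placeholder):

```python
import ctypes
import sys

# LoadLibraryEx flag values from winbase.h.
DONT_RESOLVE_DLL_REFERENCES = 0x00000001
LOAD_LIBRARY_AS_DATAFILE = 0x00000002
LOAD_LIBRARY_AS_DATAFILE_EXCLUSIVE = 0x00000040

def load_without_dependencies(dll_path: str):
    """Load a DLL without resolving the DLLs it imports (Windows only).
    Returns a module handle, or None on failure."""
    if sys.platform != "win32":
        raise OSError("LoadLibraryEx only exists on Windows")
    kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
    kernel32.LoadLibraryExW.argtypes = [
        ctypes.c_wchar_p, ctypes.c_void_p, ctypes.c_uint32,
    ]
    kernel32.LoadLibraryExW.restype = ctypes.c_void_p
    return kernel32.LoadLibraryExW(dll_path, None,
                                   DONT_RESOLVE_DLL_REFERENCES)
```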

I reached back out to David with this info, and he asked me if I had tried running the demo he sent me. Of course I hadn’t – I couldn’t even get the DLL to load correctly, so how would the GUI application run? I gave it a try anyway, just so I could smugly respond, “I gave it a try but it didn’t work.” To my surprise, though, it worked… sort of! I still didn’t get any spectra, but I got a new error: “USBDRVD.DLL is missing”. Could this be the external DLL the StellarNet DLL couldn’t load? I Googled “USBDRVD.DLL”, but almost no results came up, and there was only a single promising lead: a GitHub repo. I checked the source and it included a file with the correct name. I downloaded it into the project directory and tried the demo app again. “Error 0xc000007b: the application couldn’t start correctly”, which seemed to indicate missing or incompatible files. We had found our culprit!

I emailed David to see if he knew anything about it. He asked whether I had been running the programs from within the SpectraWiz program folder. I hadn’t. I had only run the SpectraWiz software from there on the older machines; on the new machine I had been running the demo project and my own attempts from Documents. I installed SpectraWiz on my machine, opened its program folder, and there it was in all its shining glory – USBDRVD.DLL. I copied it into the folder of my basic DLL-loading test program, took a breath, and ran it. A pop-up appeared: DLL loaded successfully! All that trouble, ten weeks of debugging and frustration, all because of this little file.

I quickly wrote a test program to ensure I could load and run all the functions within the StellarNet DLL, and it all seemed to work. 

Goal 1: complete!

A snaking path

After running a few more small tests to convince myself everything was working, it was time to get the spectrometers controlled with Python. The problem was that the StellarNet DLL is 32-bit, so any program loading it needs to be 32-bit as well. Python, on the other hand, runs with the machine’s native word size. We don’t have any more 32-bit machines – I fried the hard disk of the only one we had – plus needing a dedicated machine just for these spectrometers isn’t convenient or maintainable. I needed to get it running on a modern 64-bit machine.

After reading some forums and pondering the problem with my head in my hands, it seemed the best (only?) path forward was to create:

  1. A 32-bit “server” program in C++ that loads the StellarNet DLL.
  2. A 64-bit “client” program in C++ that launches the server and communicates with it via message passing.
  3. A Python package that gives API access to control the spectrometers by launching the client program and communicating with it via Python’s ctypes.
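
The heart of this bridge is plain message passing between two processes. Here’s a minimal, self-contained sketch of the idea in Python, with a toy child process standing in for the 32-bit C++ server and a hypothetical get_spectrum command – the real programs, names, and protocol differ:

```python
import subprocess
import sys

# Toy stand-in for the 32-bit "server": in the real setup this is a
# 32-bit C++ executable that loads the StellarNet DLL. Here a child
# Python process answers newline-delimited commands on stdin.
SERVER_SCRIPT = r"""
import sys
while True:
    line = sys.stdin.readline()
    if not line:
        break
    if line.strip() == "get_spectrum":
        print("OK 1,2,3")  # placeholder for real spectrum counts
    else:
        print("ERR unknown command")
    sys.stdout.flush()
"""

class SpectrometerClient:
    """The 64-bit side of the bridge: launches the server process and
    exchanges one-line messages with it over stdin/stdout."""

    def __init__(self):
        self.proc = subprocess.Popen(
            [sys.executable, "-c", SERVER_SCRIPT],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
        )

    def request(self, command: str) -> str:
        self.proc.stdin.write(command + "\n")
        self.proc.stdin.flush()
        return self.proc.stdout.readline().strip()

    def close(self):
        self.proc.stdin.close()
        self.proc.wait()
```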

Luckily, I already had experience with this sort of programming from other projects. I used message passing extensively, via the Actor model, in Syre, and integrated Python with a DLL using ctypes when I developed the easy-biologic package. There wasn’t much documentation for the StellarNet DLL, so I interpreted the main functionality from the demo program and stubbed out the rest of the functions I didn’t understand. Because I hadn’t programmed C++ in 15 years, and didn’t intend to again, I figured it would be a good opportunity to play around with an AI code agent for the first time. I do all my daily programming in Rust and Python (did I mention I use Rust, by the way?) and have found AI agents more obtrusive than helpful, so I hadn’t really used one before. Visual Studio has GitHub Copilot built in, so I enabled it. Most of the code completion was trash, but whenever I needed to search for something I would ask Copilot instead, and I have to say, using it as a fancy form of Google sped up development significantly.

After getting my C++ legs back under me and getting some of the more basic functions running, it was time to capture an actual spectrum. From what I understood, the spectrometers take measurements continuously, so to get a spectrum off of them you need to poll them and grab the data when it’s ready. I first tried doing this from Python by calling the scan_c API function, which calls the SWDscanLV DLL function, but kept receiving a timeout error. I figured that polling from Python was too slow, so I moved that functionality into the server program, and it worked perfectly!
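
The polling loop itself is simple; here’s its shape, sketched in Python with stand-in is_ready and read_data callables in place of the actual DLL calls (in the real system this loop lives in the C++ server):

```python
import time

def poll_until_ready(is_ready, read_data, timeout_s=5.0, interval_s=0.01):
    """Poll the device until a spectrum is ready, then read it.
    `is_ready` and `read_data` stand in for the DLL calls; raises
    TimeoutError if nothing is ready before the deadline."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if is_ready():
            return read_data()
        time.sleep(interval_s)  # avoid busy-waiting
    raise TimeoutError("no spectrum ready within timeout")
```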

I can't tell you the relief I felt when the first real spectrum came through via my Python script. I kept telling my officemate what a genius I am. I got these two 25-year-old spectrometers running on modern machines through Python 3.13 – something I don't think even David and the StellarNet team thought was possible.

Goal 2: complete!

A stream of data

The final task was to create a program to store these spectra for later use. Our solar park has about 75 solar modules of various sorts operating at any time. Most of them are outfitted with a sensor array built by Navid that tracks the instantaneous voltage and current produced by the module, as well as the panel temperature. We also have ambient sensors that collect solar irradiation, humidity, and ambient temperature data. Getting this running wasn’t too bad. The main thing I wanted to optimize for was storing only meaningful data: if the spectrum doesn’t change in any significant way for an hour, there’s no reason to store multiple spectra during that time. There also isn’t any reason to store spectra at night.

I made a little Python script that first checks whether some minimum time – in our case 1 minute – has passed since the last spectrum was stored. If so, I use the API I built on top of the StellarNet DLL – packaged as stellarnet-legacy – to collect a spectrum. I then check whether the spectrum is worth storing based on three conditions:

  1. Is there enough signal in the spectrum to be worth saving? This is calculated by summing the total counts in the spectrum and checking whether the sum is greater than some minimum threshold.
  2. Has it been too long since the last spectrum was stored? In our case this threshold is set to 10 minutes.
  3. Has the spectrum changed significantly from the previously stored one? This is calculated by taking the difference between the current and previously stored spectra, summing the count differences, and normalizing by the total counts of the last stored spectrum. If that surpasses some minimum threshold – in our case 0.1 – the spectrum is stored.
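
Condensed into code, the decision looks roughly like this – the 10-minute gap and 0.1 change threshold are the values described above, while the signal threshold min_counts is a placeholder:

```python
def should_store(spectrum, last_stored, seconds_since_stored,
                 min_counts=1000.0, max_gap_s=600.0, change_threshold=0.1):
    """Decide whether a freshly measured spectrum is worth storing.
    `spectrum` and `last_stored` are lists of counts per wavelength;
    `last_stored` is None if nothing has been stored yet."""
    # 1. Enough signal? (also skips night-time spectra)
    if sum(spectrum) < min_counts:
        return False
    # 2. Too long since the last stored spectrum?
    if last_stored is None or seconds_since_stored >= max_gap_s:
        return True
    # 3. Changed significantly, relative to the last stored spectrum?
    diff = sum(abs(a - b) for a, b in zip(spectrum, last_stored))
    return diff / sum(last_stored) > change_threshold
```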

Each stored spectrum is saved as a CSV file in an Amazon AWS S3 bucket. The S3 object key (the unique ID for the file) is then registered in our InfluxDB database. This allows us to easily pull the spectra associated with any of our other measurements to see whether the spectrum has any effect.
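
The storage step could be sketched like so – the key-naming scheme is illustrative rather than the pipeline’s actual convention, and the upload half requires boto3 and AWS credentials:

```python
from datetime import datetime, timezone

def spectrum_object_key(device: str, taken_at: datetime) -> str:
    """Build the S3 object key for a spectrum CSV. This naming scheme
    is an illustration, not the pipeline's real convention."""
    return f"spectra/{device}/{taken_at:%Y/%m/%d/%H%M%S}.csv"

def upload_spectrum(csv_body: str, key: str, bucket: str) -> None:
    """Upload the CSV to S3. The key returned above would then be
    registered in InfluxDB alongside the other measurements."""
    import boto3  # imported lazily so the key logic stays testable
    boto3.client("s3").put_object(Bucket=bucket, Key=key,
                                  Body=csv_body.encode())
```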

Goal 3: complete!

Diagram of the spectra data pipeline.

What’s next

If you remember from way back at the beginning of this post, the ultimate goal is to train machine learning models on this data to investigate:

  1. How well can we predict a module’s electrical characteristics based on its history and the current environmental conditions? and
  2. Can we use that model to understand how environmental factors impact module degradation?

We have several students working on both these routes, and are excited to share our findings with you! Be sure to keep an eye out for some of our upcoming papers on the topic.

We are also happy to collaborate with you and give you access to our data. Don’t hesitate to reach out with any interesting ideas you have so we can get even more interesting projects rolling!

Thank yous

Thanks so much to David and the StellarNet team for their time and patience during this whole ordeal. I've spent the past few months dealing with many scientific equipment companies while trying to bring old equipment back to life, and most of the larger ones just ignore you unless you're spending new money with them. It was incredibly refreshing to work with a company that actually cares about its customers and the products it produces, even ones that are older than most of the PhD students in our group.
