
LabVIEW Trial and Error (Week 3)

Monday: Today I got the chance to take a trip to the Columbia River to collect samples by the Beaver Army Terminal near Longview, Washington. We also drove to Vancouver Lake to get samples from there. We left at 9am and got back around 1:30pm, so it was a nice little break from being stuck inside. Then I worked on my LabVIEW program. LabVIEW is on all of the computers in the lab now, so I can start building my program without having to worry about which computer can handle what. I researched different example VIs and read some National Instruments forum threads to see how people tackled similar problems.
Tuesday: Today I started making my program more complex. I inserted commands to find the area under the curve (integration) and started to build the commands for peak detection. I got used to how these programs operate and what outputs they give, though it is still a bit unclear exactly what is being calculated from our signal input. I played with the different control inputs to see how the signals reacted when I changed certain parameters, such as threshold, number of samples, wait time, and width.
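The peak detector itself is a graphical LabVIEW VI, but the role of the threshold and width controls can be sketched in plain Python. This is my own simplified analogue for illustration, not what the VI actually computes internally (the function name and windowing rule are made up):

```python
def detect_peaks(signal, threshold, width):
    """Simplified peak detection: a sample counts as a peak if it
    exceeds `threshold` and is the largest value within `width`
    samples on either side of it."""
    peaks = []
    for i in range(len(signal)):
        if signal[i] <= threshold:
            continue  # below threshold: ignore, whatever the shape
        lo = max(0, i - width)
        hi = min(len(signal), i + width + 1)
        window = signal[lo:hi]
        # keep i only if it is the (first) maximum of its local window
        if signal[i] == max(window) and i == lo + window.index(max(window)):
            peaks.append(i)
    return peaks

# Raising the threshold suppresses small peaks; widening the window
# merges closely spaced ones, which matches how the controls behaved.
print(detect_peaks([0, 1, 5, 1, 0, 2, 8, 2, 0], threshold=3, width=2))
```

Playing with `threshold` and `width` here reproduces, loosely, the behavior I saw when tweaking the VI's controls.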

Wednesday: Fourth of July! No work today. Marc and I went and explored Portland a bit. We went to the Hoyt Arboretum at Washington Park and ended up walking down to the International Rose Test Gardens. Then we went downtown and met up with some of the other interns to go to dinner and watch the fireworks at the waterfront. We were going to go to the food carts for dinner, but they were closed. It was lots of fun.

Thursday: I tweaked my program some more today. I discovered another icon for integration that performs the integration point by point, and this one is clearer about what it is actually calculating: the area under each peak. I inserted and wired that icon into the program. The peak detection, however, is still getting its input from random numbers fed into a sine wave rather than from our microflow cytometer signal. Although it is detecting the wrong data, it is at least detecting something and producing some results.

Right now I am working on the program with the 10 µm beads. Once one program is made, though, it can be applied to all samples by simply changing the filename location. Eventually we will need to store the calculated data for later analysis, which is most easily done by exporting it to an Excel file. When I applied this command, I wasn't sure which parameters the Excel sheet was going to give me, what the data would even look like, or how much data there was going to be. As it turned out, it created a new Excel sheet, with some parameters, for every iteration of the while loop: the program processes the sample in smaller batches and repeats until the sample is done and there is no more data to process. In the end, I had over 500 Excel spreadsheets open on my computer, and it soon became overloaded. The computer had reached its maximum capacity of open documents while my program was still running, trying to export even more data. The computer was frozen for a long time, but I eventually deleted all of the Excel files. I learned my lesson on that one; I don't think I will be doing that again.

I also found 5 µm and 20 µm latex beads on the web and placed an order for them. The plan for next week is to run the beads and some of the live cultured phytoplankton (Thalassiosira pseudonana, T. weissflogii, and T. rotula) through the µFCM and collect that data.
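The "area under each peak" idea can be sketched outside LabVIEW too. Below is a rough Python analogue of point-by-point trapezoidal integration: it accumulates area sample by sample while the signal stays above a threshold, and emits one area per peak. This is my own sketch of the concept, not the actual VI's algorithm, and `dt` stands in for the sample interval:

```python
def peak_areas(signal, threshold, dt=1.0):
    """Point-by-point integration: while the signal is above
    `threshold`, add up trapezoids between consecutive samples;
    when it drops back below, report that peak's total area."""
    areas = []
    acc = 0.0
    in_peak = False
    prev = 0.0
    for v in signal:
        if v > threshold:
            if in_peak:
                # trapezoid between the previous and current sample
                acc += 0.5 * (prev + v) * dt
            in_peak = True
        elif in_peak:
            areas.append(acc)  # peak ended: store its area
            acc = 0.0
            in_peak = False
        prev = v
    if in_peak:  # signal ended while still above threshold
        areas.append(acc)
    return areas
```

Called on a trace with two bead-like pulses, e.g. `peak_areas([0, 0, 2, 4, 2, 0, 0, 1, 3, 1, 0], threshold=0.5)`, it returns one area per pulse, which is the per-peak quantity I believe the point-by-point icon is producing.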

Friday: Joe Needoba (my senior scientist) came in this morning to see the progress on the program and to provide help. He too was confused by the sine graph and all of the different outputs, so we decided to write our own peak-detection section of the program and wire it to the input signal. After some trial and error, I think it is getting to a point where it can start being useful. Today we also had another "brown bag seminar" with Veronika Megler, who is working on her PhD here at OHSU. She has had a pretty impressive life: she excels at computer science, has worked for IBM, and even helped create a computer game (The Hobbit) in college that won many awards. It was fun to hear her life story, and she gave us some advice on what to expect (and not expect) from our future career paths. In the afternoon, I worked on rewriting the data export so that, instead of creating a new Excel file for each iteration, it appends each iteration's data to a single existing file.

And for something not work related:
This Sunday, I am going to take a trip to Tillamook with 2 of the graduate students and the other interns for a beach trip. It should be fun—lots of cheese and ice cream!