Speech:Spring 2016 Michael Salem Log



Week Ending February 9, 2016

 * Task:

Figure out why Caesar has a delay before the login prompt appears.


 * Results:

Did some googling and found that you can set a parameter called "UseDNS" in the /etc/ssh/sshd_config file.

Here is a copypaste that explains what the setting does: ""UseDNS no" only prevents sshd from performing a validation of the client's reverse lookup. That is, if you connect with a client whose hostname resolves to a different IP address than the one with which it connects, the server will reject it if UseDNS is "yes", but allow it if "no"."

More research is needed here... I'm not sure I understand what they mean by "the hostname resolves to a different IP than the one with which it connects". I had no idea a hostname resolves to an IP; I was under the impression that the IP the client connects with is the only IP the server ever sees?
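If I get the go-ahead, the change itself should be small. A sketch (assuming a stock sshd_config, and to be tried on a drone first, not Caesar):

```shell
# Flip UseDNS to "no" (handles both a commented and an uncommented line);
# if no UseDNS line exists at all, it must be appended instead.
sed -i 's/^#\?UseDNS.*/UseDNS no/' /etc/ssh/sshd_config
grep UseDNS /etc/ssh/sshd_config   # verify it now reads "UseDNS no"
service sshd reload                # reload so new connections pick it up
```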


 * Plan:

Need to change the setting in the aforementioned file to test whether it works. Perhaps I could test this out on one of the drone servers first, for obvious reasons.
 * Concerns:

Haven't tested this yet because I do not fully understand the ramifications of the parameter. Most signs point to it being a harmless change. I want to talk to the Prof or more peers to get the go-ahead before I change any settings on Caesar.

2/13

 * Task:
 * Change password for my user account on Caesar.
 * Create our experiment folder and run the first trains.
 * Clone the operating systems of the drones we will be replacing.
 * Rack up the 1950s in place of the older model drones.
 * Fix the disaster of a cabling situation behind the servers. The zipties on the cables hurt my brain.


 * Results:

Well, I figured out how to change my password on Caesar easily enough.


 * Plan:


 * This week I want to create the team's experiment folder and get a few experiments in so I can see how this whole thing works. I would like to do some additional reading before I potentially blow up the project...
 * I also plan to bring in my 64GB thumb drive and clone the operating systems of the drone servers. Prof Jonas thinks we should be able to pull it off with small USB sticks, and I tend to agree: the servers only have 73GB drives, so it shouldn't be a problem. The rest of the team will be bringing in USB sticks as well to make the process faster. If all goes well we should be able to rack up all of the new servers in one day, though we'll likely run into a few speed bumps.
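A rough sketch of how the cloning could go with dd (device names are examples only; the real disk must be confirmed with `fdisk -l` before touching anything, since dd is unforgiving):

```shell
# Image the whole 73GB disk, compressing on the fly so it has a chance
# of fitting on a smaller USB stick (mostly-empty disks compress well).
dd if=/dev/sda bs=4M | gzip -c > /mnt/usb/drone.img.gz

# Restoring onto the replacement server's disk reverses the pipeline:
gunzip -c /mnt/usb/drone.img.gz | dd of=/dev/sda bs=4M
```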


 * Concerns:

Possibly blowing everything up:)

2/14
Logged in and read logs.
 * Task:

2/15

 * Task:

Set up the Systems team's experiment directory
 * Results:

I finally figured out how to run the dang scripts. I had an ah-ha moment when I remembered that Mike put "perl" in front of them before running them... lots of wasted time and frustration! Glad to see that Ben Leith's problem with the script was fixed by Mr. Heyner. Got the folders all set up and am ready to run our first train! Train is running as of 7:15 PM!! I will check in tomorrow afternoon to see how the train went.


 * Plan:
 * Remember how to run scripts
 * Set up experiment folder for our initial experiments as a team
 * Run first train


 * Concerns:

I'm eager to see how this train goes... Linux is fun when things actually work out, so hopefully nothing is on fire when all is said and done with this train

2/16

 * Task:


 * Create language model
 * Run decode
 * Results:

I believe I've created the language model? According to the instructions, I need to verify that the train worked properly before running the decode.

Read lots of logs and the instructions...
 * Plan:


 * Concerns:

At this point I feel like I'm just wading through the ocean. I have no idea what I've done so far or what I need to do to see if the train worked... More reading must be done, so I'm holding off on the instructions/decode until I figure out what the hell is going on.....

2/17

 * Task:
 * Install as many servers as possible
 * Get Redhat installing on as many servers as possible
 * Figure out which servers are which due to poor labeling last semester


 * Results:

We replaced 4 of the 5 1750s with 1950s and got Redhat installing on Asterix. We had some trouble with the rails of the ancient servers at first, but we eventually figured them out and got the old servers out. We also had some trouble installing the newer rails into the rack due to some very poor design choices at Dell, but we figured that out too and got the servers installed correctly.


 * Plan:

Remove the CORRECT 1750s and install the newer servers. Possibly help out the cabling situation behind the rack. Install the new servers without destroying anything, cabling included. Install Redhat using the instructions from last year's systems team.


 * Concerns:

It's worth noting that the 1750s currently installed are installed incorrectly and are pretty flimsy in the rack. The rails are pretty confusing, especially for those super old servers. Some of the cabling behind this rack is atrocious, particularly the old KVM cables; KVMs are almost always a problem. Some of the display cables were literally stuck inside the servers and we had a lot of trouble pulling them. The labels were not updated when last semester's systems team shuffled things around a bit, so we were very concerned and careful about which servers were getting pulled. We had to ask and double-check many times.

2/19

 * Task:
 * Install Majestix's replacement server
 * Move the Redhat installation along as far as I had time for.


 * Results:

After some deliberation I figured out which server I needed to pull, and replaced it with the newer 1950. I also finished off the installation of Redhat on Asterix, and moved along to Miraculix.


 * Plan:


 * Figure out which server is the proper server, remove it, and replace it.
 * Kick off installation of Redhat on Miraculix


 * Concerns:

I came in thinking Methusalix was the server I needed to replace, but discovered that this server was already a 1950. I went back through my emails and found Zach's mention of what servers he is using this semester, and Majestix was the only 1750 that he wasn't using. Once I finished off the install of Redhat on Asterix, I couldn't remember the root password, so I didn't finish up the networking. That still needs to be done.

Week Ending March 1, 2016
2/24


 * Task:

Troubleshoot install issues for the remaining servers. Install Redhat on the remaining servers. Configure networking on Asterix.

 * Results:

We ended up getting Redhat to begin installation on Majestix. Asterix network configuration achieved!

I initially thought the problem with the install on Majestix was due to a failed disk, but once I got into the server room I realized the other servers were having the exact same issue. During the Redhat installation, the installer looks for a disk to partition, and at that point no disks were being found. The other group members were convinced the problem was the ROMB battery being under a 24-hour charge; I didn't think this was the issue. I went straight for the SAS controller (Ctrl+R on boot) and found that it was seeing the drive, but the drive was flagged as foreign. I found no way to clear the current config from there, so I restarted the server. On boot, there was a short, one-line prompt asking for a key press to fix foreign disks, which allowed us to wipe the disks and make Redhat see them. By that point we had been in the server room for a couple of hours troubleshooting. I also followed the directions Muhammad left last semester to configure the networking for Asterix, which was then pingable from Caesar. The last thing left to do on Asterix is to point it at Caesar's /mnt/main.

 * Plan:
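For the /mnt/main step, my understanding is that it comes down to an NFS mount along these lines (the hostname and export path are assumptions on my part; the real values are in last semester's directions):

```shell
# Mount Caesar's exported /mnt/main on the drone.
mkdir -p /mnt/main
mount -t nfs caesar:/mnt/main /mnt/main

# To make it survive reboots, an /etc/fstab entry like:
#   caesar:/mnt/main  /mnt/main  nfs  defaults  0 0
```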


 * Concerns:


 * The other servers still need Redhat installed and their foreign configuration cleared.
 * Asterix needs to be pointed to Caesar's /mnt/main.
 * The other servers need this as well, plus networking.

2/26


 * Task:

Worked on the proposal with the team.

2/27


 * Task:

Worked on the proposal with the team.

3/1

 * Task:
 * Finish Redhat Installations
 * Another member of the team mentioned drive problems with Idefix.


 * Results:

Idefix had a bad HDD; I replaced it and began installation. Installs completed on the remaining servers using last semester's notes, except for Idefix. See Concerns.


 * Plan:

Finish the installations on the remaining servers. Troubleshoot a possible dead drive.


 * Concerns:

Obelix is only showing a CLI, which suggests the operating system was installed incorrectly (no GUI). It might have to be reinstalled. Methusalix appears to be stuck on boot? Left it alone for now...

3/2

 * Task:
 * Give the tools group Majestix to use as a test server.
 * Assess the GUI problem on Obelix.
 * Configure the network settings for Obelix, Idefix, Majestix, and Miraculix.
 * Point the /mnt/main directories for the drones to Caesar using the instructions from last semester's systems group logs.


 * Results:

 * The systems group logs from last semester are for something else.
 * The tools group now has control of Majestix.
 * The team has reinstalled the OS on Obelix, and it is now identical to the other servers.
 * Got the network configurations finished on all of the servers.
 * After much confusion, I discovered that the correct instructions for pointing /mnt/main to Caesar are at this link:


 * Plan:


 * Decide together as a group whether we need to reinstall Obelix.
 * Finish the network configurations for the previously mentioned servers.
 * Follow instructions to point /mnt/main to Caesar on the servers.


 * Concerns:

Now that we have the drones all pointed to Caesar, we have to run a test train on each drone to see how they compare in speed. This can be accomplished by using a CLI command to time the train.
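The command itself should just be `time` prefixed to whatever kicks off the train. A sketch, with `sleep` standing in for the train and the script path assumed:

```shell
# "time" prints real (wall-clock), user, and sys lines when the command
# finishes; "real" is the number to record for each drone.
time sleep 1

# For the actual run it would look something like:
#   time perl scripts_pl/RunAll.pl
```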

3/7

 * Task:


 * Review the team's changes to the wiki documentation over the last week
 * Hardware page has been changed
 * Redhat install instructions changed
 * Systems group logs changed
 * Neil finished the train I started and I need to read the experiment logs


 * Result:


 * Went over the experiment logs that Neil finished; I think I have an idea of how to finish the trains... he said the directions/commands on the wiki are correct pretty much to a T

3/12
The experiment that I ran earlier in the semester has no output on the wiki. I need to find the log file for the experiment and put it on the wiki.
 * Task:


 * Results:

I think this experiment is FUBAR. I need to run another one correctly in a new sub experiment.

Here's what is being output to my scoring.log:

sclite: 2.3 TK Version 1.3
Begin alignment of Ref File: '001_train.trans' and Hyp File: 'hyp.trans'
  Alignment# 1 for speaker sw2001b
  Alignment# 1 for speaker sw2001a
  Alignment# 2 for speaker sw2001a
  Alignment# 2 for speaker sw2001b
  Alignment# 3 for speaker sw2001b

I've run additional scorings to see if that was the problem, but to no avail.


 * Plan:

Dig through the files of the sub experiment and find what went wrong.


 * Concerns:

Neil ran the decode on this experiment, and I wasn't present when he did.

'''UPDATE: I'm an idiot. Didn't realize you have to scroll down to see the results. '''

3/20

 * Task:

I need to run a 5hr train on Obelix and Idefix this weekend. I didn't realize that Jonas wanted this done ages ago. I need to time the train and compare it to Caesar's completion time, which is 2300 seconds. Idefix and Obelix were chosen because one has a 10k-RPM drive and the other a 15k-RPM drive.


 * Results:

Will check back on this one tomorrow. It looks like the servers are being used by someone right now. I have time to get this done, it shouldn't be too long. The directory structure is set up.


 * Plan:

Follow the new instructions the modeling group has created and make sure they're followed to a T. I want proper scoring.log files this time. I'm going to run one on Idefix tonight and one on Obelix tomorrow.


 * Concerns:

I'm worried that I will waste a train and get another FUBAR experiment. Last time was a big problem.

3/21

 * Task:

Tonight I need to get these trains run so I can check on them tomorrow and record the results for Wed's class.


 * Plan:

Follow the new instructions the modeling group has created and make sure they're followed to a T. I want proper scoring.log files this time. I'm going to run one on Idefix tonight and one on Obelix tomorrow.


 * Concerns:

Hopefully no one is using these servers tonight.


 * Results:

Will check on these tomorrow. The trains are running now.

3/22

 * Task:

Get the decodes and scorings done and update the wiki with the results.


 * Plan:

Follow the instructions for the lang model and decode. Put results on the wiki with comparisons to Caesar.


 * Concerns:

Roadblocks will be a major annoyance. These need to be in tonight.


 * Results:

3/23

 * Task:

Install Redhat on the new Rome, which is the Dell 2950 at the top of the rack. Set up internet access on Rome so that we can download and host an IRC channel on the drone.


 * Plan:

Install Redhat using the instructions from the previous installs. This unit will not be pointed to Caesar's /mnt/main like the other drones are.


 * Results:

The Redhat install went as planned, with no hitches. The roadblocks came when trying to set up the internet connection. We started by setting up the routing table the same way as on Caesar and the drones, using the IP those servers have reserved for Rome. This all went well, but we were never able to connect Rome to the internet. It is talking to the other servers just fine, but we need internet access to install IRC on Rome. Since we can't get online with this unit, I think the best method would be to get the files onto an external USB drive and install from that. We can set up IRC to work on the little LAN we have going.
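For whoever picks this up next, the static config we mirrored boils down to something like the below (the interface name and every address are placeholders, not Rome's real values):

```shell
# RHEL-style static interface config (values are examples only).
cat > /etc/sysconfig/network-scripts/ifcfg-eth0 <<'EOF'
DEVICE=eth0
BOOTPROTO=static
IPADDR=192.168.1.60
NETMASK=255.255.255.0
GATEWAY=192.168.1.1
ONBOOT=yes
EOF
service network restart

# Since the LAN works but the internet doesn't, the next checks:
route -n                 # is there a default route via the gateway?
ping -c 3 192.168.1.1    # can we even reach the gateway?
```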

3/26

 * Task:

Correct the mistakes from last week and get a working train running to time for our documentation. I will be running trains on Idefix and Obelix.


 * Plan:

I've heard from Neil that the modeling group fixed generateFeats.pl, which was broken last week and caused my trains to be a mess. So, I will run the two failed trains from last week again (experiment 0287 is full of attempts). I will be attempting to use the "time" command to time the length of the trains. This will require me to be logged into the server for the length of the 5hr train.


 * Results:

It appears that generateFeats.pl is still broken... or I'm messing up somewhere. I prepared two separate experiments and both failed at the generateFeats.pl step with these errors:

-cfg not specified, using the default ./etc/sphinx_train.cfg
-param not specified, using the default ./etc/feat.params
Failed to open control file etc/006_train.fileids: No such file or directory at scripts_pl/make_feats.pl line 99.
Complete! Run "nohup scripts_pl/RunAll.pl &" to begin training.

This is the same error that Tom Rubino was getting in an earlier email chain; he seemed to have fixed it by running prepareTrainExperiment.pl as root.

3/27

 * Task:

Figure out why last night's train wouldn't run. Run a train on Idefix if possible, since Obelix is not cooperating with me.


 * Plan:

I sent an email last night to the modeling group asking for help with my problem, but I got no response. Same from my own group. It's the weekend, I guess. I figured this meant my question was stupid, so I'll be doing some investigation to see why the train didn't complete. Go through the corpus to see why the first_5hr train will not run; it is failing at generateFeats.pl. Run a train if possible.


 * Results:

I found that the first_5hr train no longer exists. I had a feeling this was the case from the email chains between Jonas and the modeling group. I found that there is a first_4hr, so I ran that. generateFeats.pl completed successfully and I am now timing a train on Idefix using the time command. I'll just have to stay logged in through the duration of the train to see the time it took.
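If staying logged in turns out to be a pain, one workaround I haven't tried yet: redirect time's report into a file, so the train itself could run under nohup (`sleep` stands in for the train command below):

```shell
# `time` writes its report to stderr; wrapping the command in a brace
# group lets us redirect just that report into a file.
{ time sleep 2 ; } 2> timing.log
cat timing.log    # real/user/sys lines end up here
```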


 * Concerns:

Obelix is not reachable through SSH from Caesar. Hopefully this time command works properly. If not, I'll have to figure out how to implement Jonas' method of editing the script, which I am not comfortable with, but I really don't feel like getting attacked in class.

3/29

 * Task:

Need to run trains tonight. See previous logs for information about planned trains. UPDATE: Will be running/timing the trains and documenting them in experiment 0287/008 & 009. I also need to go in to school today and take a look at Majestix for the Tools group. This server needs an internet connection so they can download Yum. Neil was unable to solve this issue.


 * Plan:

Read the logs of the modeling group to try and make sense of whether this train is ready to be tested or if I'm going to be jumped in class for trying to run it. UPDATE: James gave me the go-ahead to train on the new corpus. Will update with results.
 * Run 145hr trains on Obelix and Idefix
 * Record the time they took to complete using linux's built-in "time" command
 * Find out why Majestix will not ping out to Google or the DNS servers.


 * Results:

Read the modeling group's experiment logs for the trains they ran while building the new 145hr corpus. Each train they ran fluctuated wildly in time to completion. I need more information; there's no reason to run a timing train if we know the time to complete varies wildly anyway. Can't help feeling useless. Need to run some kind of training/decoding tonight. UPDATE: I could not solve Majestix's internet problem. Things I tried:
 * Configured Majestix's eth0 port, which is hooked up via Ethernet to the main router. I used Caesar's eth0 as a template and used the IP that Neil acquired.
 * Verified the routing table against Caesar, looked fine
 * Attempted to give Majestix the DNS servers that Caesar is using, but the resolv.conf file is being generated by dhclient and does not persist across a NIC restart. Wasn't able to stay longer to fix this; I need to run trains tonight.
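A possible fix to try next time (a sketch; the nameserver address is a placeholder, not one of Caesar's actual DNS servers): tell the DHCP client to leave resolv.conf alone, then pin the nameservers by hand.

```shell
# PEERDNS=no stops dhclient from regenerating resolv.conf on restart.
echo 'PEERDNS=no' >> /etc/sysconfig/network-scripts/ifcfg-eth0

# Then set the nameservers manually (placeholder address).
cat > /etc/resolv.conf <<'EOF'
nameserver 192.168.1.1
EOF

service network restart
```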


 * Concerns:

I sent an email to the modeling group midweek about running a train, with no response. This leads me to believe that they were either too busy to reply or it was a stupid question, so they ignored it. I don't really know if the train is ready to be run; I would like a briefing on it before I run it. I was told to hold off running trains last week until given the go-ahead. Based on the email chain, I believe the train is fully operational and ready to be tweaked and tested.

3/30
Helping Saverna run a train
 * Task:


 * Results:

We finished a simple train so she could get some experience running one...


 * Plan:

Teach her about the steps to running and decoding so she understands what is happening...


 * Concerns:

No way to learn all of this all at once this late in the semester...

4/3
I've been unable to do anything with the two timing experiments this weekend. No idea what is happening with Idefix and Obelix; anyone trying to SSH into them gets a host-unreachable error. The school is inaccessible until Monday.

UPDATE: The school suffered a power outage over the weekend. Neil fixed the issue on Monday morning. Going to complete the trains.

4/4

 * Task:
 * Need to build language models for the two timing experiments
 * Need to decode those two experiments
 * Need to score those two experiments
 * Need to document the results/time


 * Results:

Results of Idefix have been posted under exp#0287/009. Obelix was in use all night. Deferring until tomorrow.


 * Plan:

Nothing out of the ordinary. I'll be building language models then decoding tonight. Scoring tomorrow. Will be running the language model/decode on Idefix tonight. Obelix is occupied so I won't be disturbing it.


 * Concerns:

On Tuesday Neil said we will be unable to run any training or anything of the sort, so this needs to get done ASAP.

I will be unable to get a language model for Obelix tonight due to "mru567" taking up 100% of resources training or decoding. I don't want to interrupt their work.

4/5

 * Task:

Need to time the decode on Obelix. Need to start a train on Caesar.


 * Plan:

Run the decode on Obelix using the time command. Run a default 145hr train on Caesar.


 * Results:


 * Concerns:

4/10
Need to research configurations for sphinx trains.
 * Task:


 * Results:

Found some good info about the configs I was looking for. Shared with my team.


 * Plan:

Google and use the resources on this wiki to research Sphinx configuration in order to find the best baseline WER

4/11

 * Task:

Need to finish up the last train on Obelix to get a feel for how long the process takes in comparison to Caesar.


 * Plan:

The decode is the only thing left to do on this train. I ran it about a week ago and never got around to finishing it up. Results will be posted in 0287/008 along with the other timed trains.


 * Results:

The results are posted in the experiment log above.

4/12

 * Task:

The team is expecting fantastic results this week. :)

4/16
More research into sphinx parameters this week.
 * Task:

4/18
My research is looking good. Our next trains should have a lower WER after we apply my knowledge. Theoretically, at least. This type of thing tends to be pretty volatile, but I'm confident in my sources.
 * Task:

4/24
Looking over some of the research the other team members have come across. This week we will finally be building the IRC channel on Rome. We also need to bring the internet to Rome.
 * Task:

4/25

 * Task:

Researching some of the parameters from the reading

4/26

 * Task:

We've decided against creating the IRC channel on Rome this semester due to time constraints; there are only 3 weeks left in the semester. Plus, Rome only has two ethernet ports, which will impede Tom Rubino's plan to incorporate rsync into Rome. The systems group will be documenting the IRC process for next semester's team to use if they so desire. Researching parameters for the upcoming competition.

4/31
The two teams have joined into one team this week. The reasoning is that, due to the current hardware/software constraints of the project, we have essentially come to a standstill on WER. We need to combine forces so the whole class can contribute its brain power toward a respectable WER. It turns out that both teams were pretty much in the same boat on scoring; in fact, the other team's score was better than ours. Either way, I will be creating the template for the final report and submitting it to the team for review, as well as researching the new sources that we've obtained from combining the teams.
 * Task:

5/1
The template is complete. The semester is winding down and the main brains, Matt, Jon, Ben, and James, are chugging away... at this point I feel like all I can do is watch and stay out of their way...
 * Task:

5/3

 * Task:

The team is hard at work trying to get a final score before the last week of class.

Week Ending May 10, 2016

 * Task:


 * Results:


 * Plan:


 * Concerns: