Speech:Spring 2017 Mark Tollick Log



Week Ending February 7th, 2017

Plan

2/2/2017

  1. Going to Andrew's house tomorrow to set up the server and discuss options in person rather than over Hangouts
    • Includes setting up a file-sharing server
  2. Attempt to gain access to the server room once Jonas gives us the green light on the fobs we need, with the proper permissions

2/3/2017

  1. Checking in, reading new posts, including Vitali's post about how he ran and added an experiment

2/4/2017

  1. Andrew and I have set up the server
    • Ensured that Bonnie and Julian are tracking the changes we have made

2/7/2017

    • Meeting with Andrew to discuss next steps
    • Looking at investing in iDRAC to remotely control the servers: reboot, start, and reinstall ISOs without physical access
    • Looking into open-source configuration management tools such as Ansible, Salt, and Chef for provisioning new servers and ISOs
    • Also looking into Nagios for open-source server monitoring, which includes reporting
    • Decoding the test train we ran and reviewing the results
Task

2/2/2017

  1. Find out what we need to do; meet and discuss options and a schedule for future meetings
  2. Jot down ideas for the proposal, and discuss each other's strengths and where we need work
  3. Look at getting the IRC server set up on Rome for class discussion; we have Slack now, but IRC is already installed
  4. Check the status of the backup server: ensure there's enough space and that the cron job is working

2/3/2017

  1. Checking in, reading logs, and becoming familiar with how to add an experiment

2/4/2017

  1. Set up ownCloud server at Andrew's house
    • Ensure a secure connection
    • Establish a VPN for team members to connect through
    • Set up first experiment on wiki and /Exp directory
    • Run first train

2/7/2017

    • Research tools to be used
    • Decode train results from Saturday


Concerns

2/2/2017

  • Jonas had stated that we can have a file-sharing system as long as it isn't controlled by someone outside the team, or at least that's how I interpreted it. Before we commit to it, we should run it by him.
  • Need to get a fob to gain access to the server room so we can start getting hands-on and look at the services running on the Rome and Caesar machines.

2/4/2017

  • Concerned about how the train is running, but haven't received any errors.
  • Will review results at a later time

2/7/2017

  • Budget concerns, even though iDRAC cards are cheap and would give the Systems team access to the servers even in the case of a malfunction
  • Open-source software needs to be approved before we can install it. Also, should it fall under Tools or under Systems? I think Systems.
Results

2/2/2017

  1. Set up an internal ownCloud file-sharing server for our own use at Andrew's house
  2. Added an experiment to our wiki page: created experiment 0297 with sub-experiment 001 and ran a train

2/4/2017

  1. Set up VPN and file sharing access for our group
  2. Added the experiment to our wiki page and in the /Exp directory
  3. Confirmed that the VPN and ownCloud work for team members

2/7/2017

  1. Got some solid ideas; need to seek approval
  2. Decoded the train; need to read more logs and documentation on how to interpret the data I'm looking at
  3. Good start to the semester with strong ideas; now we need to keep that momentum going

Week Ending February 14, 2017

Plan

2/9/17:

  1. Look at drones and see which ones are having issues
  2. Reach out to Jon Shallow
  3. Follow up for badge for access to room

2/12/17:

  1. Review logs. Lost power, so internet and my computer are down; using mobile and staying warm.

2/13/17

  1. Got the generator running; still no power from the main line. Don't want to risk a surge on my main PC, so I'm using my phone and continuing to read logs.

2/14/17

  1. Have a meeting with the team to discuss options
  2. Reach out to Jon to check the status of the server he had worked on
  3. Look at other machines, as my power issues have been resolved
Task

2/9/17

  1. SSH into the machines and see what services are running (see the sketch below)
  2. Unable to reach out to Jon, as I don't have his contact information; will try again later
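
A minimal sketch of the kind of checks item 1 implies on a RHEL 6 box; the hostname is a placeholder and this is a general recipe, not the exact commands we ran.

     ssh root@rome                  # or any of the drone hostnames
     service --status-all | less    # quick look at init-script services and their state
     chkconfig --list | grep 3:on   # services enabled for runlevel 3 (multi-user)
     netstat -tulpn                 # listening ports and the daemons behind them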

2/12/17:

  1. Unable to log in, as power was knocked out by an accident involving a car and a telephone pole. Read the wiki on my phone

2/13/2017:

  1. Issues with power continued, so continued to read logs.

2/14/2017:

  1. Meet with the group via Hangouts
  2. Set up and run another train to see if we can get results
  3. Reached out to Jon Shallow
Results

2/9/2017

  1. Still no badge access
  2. Andrew has results from checking the services on the servers

2/12/2017

  1. Read logs to get a better understanding of the system

2/13/2017

  1. Read more logs, as power was still an issue leading into the night

2/14/17

  1. Heard back from Jon, continuing comms with him
  2. Andrew is running a train with my guidance, with Bonnie and Julian following along as well
  3. Still trying to figure out exactly what's going on with the machines
Concerns
  1. No fobs to gain access to server room
  2. Limited comms with Jon Shallow
  3. Some servers still not able to be accessed
  4. Snow and wind have caused issues with my home internet and power

Week Ending February 21, 2017

Plan

2/16/17

  1. Get drones up and running for the other groups to start working on
  2. Run a successful experiment
  3. Figure out how to break /usr/local links on obelix and idefix

2/17/17

  1. Get root password on Majestix reset using RHEL disk
  2. Get Jonas to create user accounts on Majestix

2/18/17

  1. Checking in for logs

2/20/17

  1. Configure Majestix for use on the network
    • Mount /mnt/main for use with essential files (see the sketch below)
  2. Have user accounts created on Majestix so ssh-keygen works
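
A minimal sketch of mounting /mnt/main on Majestix, assuming it is an NFS export from the main server; the exporting host (caesar is used as a stand-in) and mount options are assumptions, not confirmed configuration.

     # assumption: /mnt/main is exported over NFS by the main server
     mount -t nfs caesar:/mnt/main /mnt/main
     # and a matching /etc/fstab line so it survives a reboot:
     # caesar:/mnt/main   /mnt/main   nfs   defaults   0 0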
Task

2/16/17

  1. Hashed out in class which group is getting which server; Jonas directed
  2. Make sure servers are accessible by all groups
  3. User accounts created on every computer except Majestix

2/17/17

  1. Finish running the experiment and hopefully get a successful decode, unlike last time
  2. Reset the root password on Majestix using the RHEL 6.6 disk and rescue mode (see the sketch after this list)
    • Boot from the rescue disk
    • Enter rescue mode
    • Mount the installed root filesystem and chroot into it
    • Run passwd root to change the root password
  3. Accounts created on Majestix so ssh-keygen works
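
A minimal sketch of the rescue-mode steps in item 2, assuming the rescue environment mounts the installed system under /mnt/sysimage (the RHEL 6 default when you let rescue mode find and mount the installation).

     # after booting the RHEL 6.6 disc and choosing "Rescue installed system"
     chroot /mnt/sysimage     # switch into the installed system's root filesystem
     passwd root              # set the new root password when prompted
     exit                     # leave the chroot
     reboot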

2/18/2017

  1. Checking in for logs

2/20/2017

  1. Have a look at Majestix to ensure correct configuration after the root password reset
  2. Make Majestix visible to the other machines, and vice versa
  3. Link to /mnt/main
Results

2/16/2017

  1. Groups have their respective machines, and can SSH into them. SSH-Keygen is established for most groups, to my knowledge
  2. Ran a successful experiment with results
  3. Unable to see how the symbolic links are made and broken; Jonas was moving far too quickly through the script

2/17/2017

  1. Root password has been reset successfully
  2. Jonas can't access it because /mnt/main wasn't added properly
  3. Need to add /mnt/main in order to get it sorted out

2/18/2017

  1. Had a meeting with the group to discuss the proposal update
  2. Checking in on logs and the like

2/20/2017

  1. Finished configuring Majestix
    • The machine now talks to the other machines and can be seen by them
  2. /mnt/main has been added to Majestix, so ssh-keygen will now work
  3. Updated /etc/hosts with all machines so hostnames are now associated with IP addresses (see the sketch below)
  4. Added ASCII art to Majestix upon ssh login
  5. Jonas ran script to add users
    • The Tools group successfully got into the system
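
A minimal sketch of the kind of /etc/hosts entries item 3 describes, so hostnames resolve without DNS; the IP addresses shown here are placeholders, not the lab's actual addressing.

     # /etc/hosts
     192.168.10.1    caesar
     192.168.10.2    rome
     192.168.10.10   majestix
     192.168.10.11   obelix
     192.168.10.12   idefix
     192.168.10.13   miraculix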
Concerns
  1. Machines are inconsistent; there are substantial differences between the different drones, Caesar, etc.
  2. Machines aren't automatically bringing up eth0 on reboot, causing them to lose connectivity to each other if a system restarts (see the sketch after this list)
  3. rsync doesn't seem to be working, from what I saw; need to investigate further
  4. Need to look at the backup machine to see when the most recent backup was completed
  5. Need to look into network bridging, since the servers have two Ethernet ports, so we could pass internet through Caesar
  6. Need to have the Tools group install GCC-C++ on Rome so we can finally finish the IRC installation; that was the hang-up last year
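
A minimal sketch of the interface-config change that would address item 2 on RHEL 6, assuming the interface is eth0 and configured in the standard location; the address values are placeholders.

     # /etc/sysconfig/network-scripts/ifcfg-eth0
     # ONBOOT=yes brings the interface up automatically at boot
     DEVICE=eth0
     ONBOOT=yes
     BOOTPROTO=static
     IPADDR=192.168.10.10
     NETMASK=255.255.255.0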

Week Ending February 28, 2017

Plan

2/22/2017

  1. Figure out internet access issue
  2. Get all machines registered with UNH's non-browser device registration using their MAC addresses

2/24/2017

  1. Licensing issues are still around; need to get them resolved so we can install packages via the YUM installer
  2. Internet is one machine at a time with the extra Ethernet cable, so we must prioritize machines by group/task

2/27/2017

  1. Check on the idefix server to see why it won't display over VGA

2/28/2017

  1. Install a switch so we can route internet access to multiple drones over one connection
Task

2/22/2017

  1. Internet access matters for these machines. They only sometimes rely on it, but having it means we don't have to go into school just to plug the Ethernet cable into a port, which takes three seconds.
  2. Registered the systems with UNH's non-browser device registration so that when we do plug in the orange Ethernet cable they get access quickly, instead of needing to restart network services every time the environment changes

2/24/2017

  1. Figure out the licensing issues and who has access to the key so we can get it up and running
  2. Rome needs the license to install GCC-C++ in order to finalize the IRC installation
    • Internet is functioning on Rome

2/27/2017

  1. Idefix will not display over the VGA port the way the other servers do. I don't know why this is happening, but I do know the other servers work

2/28/2017

  1. Andrew and I configured and set up a switch from an old router, which allows the drones to have internet access at the same time
  2. Checked the machine configurations to ensure they are operating the way they should


Results
  1. Licensing turns out to be tied to an activation code. Will attempt to register it on 3/1
  2. Internet is still one machine at a time, so we need to implement a prioritization system by group/server
  3. Rome still doesn't have access to YUM, so IRC is still on hold; will try to register on 3/1 and track progress
  4. Idefix's VGA port is now functioning after a reboot; unsure what the issue was
  5. The switch was working, but Jonas notified us that we cannot use it because UNH would find out and bill for extra "connections"
  6. Disconnected the switch, so we're back to the plug-and-play method; that'll have to do for now
Concerns
  1. Need to get this license fixed, or point yum at CentOS mirrors in order to get packages (see the sketch after this list)
    • It's not an issue to install individual packages by hand, but YUM installs dependencies along with the desired package, which is much simpler
  2. Need to install and test Torque on one of the machines
  3. We have good ideas; we need to record in the wiki what does and doesn't work. I find the wiki very difficult to navigate at times, and "updated" information gets added but the old information isn't necessarily deleted, which can cause confusion
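
A minimal sketch of pointing yum at CentOS 6 mirrors as a stopgap, as mentioned in concern 1. This is an assumption about how it might be done, not something we have tried; the baseurl is an example and mixing CentOS packages onto RHEL should be run by Jonas first.

     # /etc/yum.repos.d/centos-base.repo (hypothetical)
     [centos-base]
     name=CentOS 6 Base (stand-in while RHEL is unregistered)
     baseurl=http://vault.centos.org/6.8/os/x86_64/
     # example mirror; verify the exact URL and point release before using
     enabled=1
     gpgcheck=0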

Week Ending March 7, 2017

Plan

3/1/17

  1. Figure out the WLAN card that Prof. Jonas gave us
  2. Work with Tools to transfer ownership of Obelix to them; then we get Majestix

3/4/17

  1. Try running a train on a drone; haven't done so before
  2. Get group up to speed with speech training and decoding

3/5/17

  1. Troubleshoot drone issues
  2. Begin reading documentation on how to reinstall and set up a new drone on Majestix, as a reinstall is required

3/7/2017

  1. Upon further investigation into the drone issues, the symbolic links aren't in place, so we cannot do a decode
  2. Need to figure out how to run a train on the local machine using a local copy of /mnt/main/local


Task

3/1/17

  1. Explore options for extracting and installing a driver usable by the USB WiFi stick
  2. Establish a timeline for Tools to review their server and transfer it over to us

3/4/17

  1. Run train and decode on drone
  2. Held a Hangout with the group to become more proficient, as a whole, with Sphinx and the training and decoding process

3/5/17

  1. The drone is having issues with decoding; need to research and post a status update
  2. Need to reinstall RHEL on Majestix this Wednesday

3/7/17

  1. Attempt to run a train again
  2. Need to look into copying /mnt/main/local into our own directory
  3. Reach out to other groups for possible assistance, see if it works for them
Results

3/1/17

  1. Andrew has taken the lead on this; he extracted the installer from a tarball on the command line and is seeing if it works on Rome
  2. Tools group will reach out to us when they complete their scan of the system

3/4/17

  1. Attempted to run a train and decode; the train worked, the decode didn't. Short on time, need to revisit.
  2. Group is getting up to speed in speech recognition

3/5/17

  1. Training and decoding are having issues on the drone; need to revisit when I have more time to troubleshoot
  2. Install still slated for Wednesday, should be good to go, barring any roadblocks

3/7/17

  1. Ran another train to see if the issue was in how I was running the train; I thought the issue may have been command-related
    • That wasn't the issue; the same error came up. Reading the decode log, it turns out Sphinx isn't seen when running the decode
    • Realized we don't have /usr/local copied over, nor do we have a symbolic link. Tried to run ln -s /usr/local /mnt/main/local, but it keeps saying "File exists", so I'm not really sure what the issue is yet; need to figure it out (see the sketch after the score table below)
    • Did copy /mnt/main/local into a directory I had made called /usr/local-OFF
    • Will have to test later tonight, when I have more time and am not in class
     ,-----------------------------------------------------------------.
     |                            hyp.trans                            |
     |-----------------------------------------------------------------|
     | SPKR    | # Snt # Wrd | Corr    Sub    Del    Ins    Err  S.Err |
     |---------+-------------+-----------------------------------------|
     | sw2001b |    1      3 |100.0    0.0    0.0   66.7   66.7  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2005a |    2     42 | 81.0   14.3    4.8    4.8   23.8  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2006b |    1     29 | 69.0   20.7   10.3    6.9   37.9  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2007b |    2     39 | 92.3    7.7    0.0    2.6   10.3  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2007a |    1      7 | 57.1   28.6   14.3    0.0   42.9  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2008b |    1      3 |100.0    0.0    0.0    0.0    0.0    0.0 |
     |---------+-------------+-----------------------------------------|
     | sw2009a |    1     32 | 65.6   28.1    6.3    3.1   37.5  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2009b |    1      3 |100.0    0.0    0.0    0.0    0.0    0.0 |
     |---------+-------------+-----------------------------------------|
     | sw2010a |    1      5 | 40.0   60.0    0.0   20.0   80.0  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2012b |    2     90 | 85.6   12.2    2.2    2.2   16.7  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2013b |    1      5 | 40.0   60.0    0.0    0.0   60.0  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2013a |    1     17 | 47.1   52.9    0.0   17.6   70.6  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2014a |    1     21 | 76.2   19.0    4.8    9.5   33.3  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2015b |    1     43 | 72.1   18.6    9.3    2.3   30.2  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2017a |    1      3 | 66.7   33.3    0.0  133.3  166.7  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2018b |    1      3 | 66.7   33.3    0.0    0.0   33.3  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2018a |    1     12 | 66.7   33.3    0.0   33.3   66.7  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2019b |    1     38 | 86.8    7.9    5.3    0.0   13.2  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2019a |    1      8 |100.0    0.0    0.0   12.5   12.5  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2020a |    2     53 | 67.9   18.9   13.2   11.3   43.4  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2020b |    1     45 | 57.8   33.3    8.9   13.3   55.6  100.0 |
     |---------+-------------+-----------------------------------------|
     | sw2022b |    1      3 | 66.7   33.3    0.0  100.0  133.3  100.0 |
     |=================================================================|
     | Sum/Avg |   26    504 | 74.8   19.6    5.6    8.1   33.3   92.3 |
     |=================================================================|
     |  Mean   |  1.2   22.9 | 73.0   23.4    3.6   20.0   47.0   90.9 |
     |  S.D.   |  0.4   22.8 | 18.6   18.3    4.8   35.1   40.7   29.4 |
     | Median  |  1.0   14.5 | 68.4   19.9    0.0    5.8   37.7  100.0 |
     `-----------------------------------------------------------------'
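
A minimal sketch of one way the "File exists" error noted above could be resolved, assuming the goal is for /usr/local to point at /mnt/main/local; the direction of the link is my assumption, since the command we tried had the arguments the other way around.

     # ln -s takes the TARGET first and the LINK NAME second; "File exists" means the
     # path given as the link name already exists, so ln refuses to create the link.
     mv /usr/local /usr/local-OFF         # set the existing directory aside (as noted above)
     ln -s /mnt/main/local /usr/local     # make /usr/local a symlink to /mnt/main/local
     ls -l /usr/local                     # should show: /usr/local -> /mnt/main/local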


Concerns
  1. Need to figure out the issues with the drones
    • The issue isn't isolated to our drone, so we can't run our own experiments
  2. Sometimes unable to umount /mnt/main/, which can be problematic when installing and configuring new machines (see the sketch below)
  3. Ensure that everyone knows to umount /mnt/main on the machine they're installing software on, so as not to muck anything up
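
A minimal sketch of how a busy /mnt/main mount can be diagnosed and released before installs; fuser/lsof here is a general technique, not something we have verified on these machines.

     umount /mnt/main       # fails with "target is busy" if a process still has files open there
     fuser -vm /mnt/main    # list the processes holding the mount open
     lsof +D /mnt/main      # alternative: list open files under the mount point
     umount -l /mnt/main    # lazy unmount as a last resort, once the offenders are dealt with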

Week Ending March 21, 2017

Plan
  1. Gain knowledge with Torque
  2. Have fun on spring break
Task
  1. Investigate Torque
Results
  1. Installed Torque on Majestix with Andrew.


Concerns
  1. Torque is kind of a pain. We can make it so both machines can queue jobs, but it's unknown whether we can do two clusters or separate machines by their tasks, i.e. each group gets a cluster and they can queue jobs alongside each other; we'd probably need separate installs on the MOM machines (see the Torque documentation for the MOM reference, and the sketch after this list)
  2. Nothing else to note
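
A minimal sketch of how compute nodes are declared to a Torque server, assuming Majestix is the pbs_server host and the default TORQUE_HOME of /var/spool/torque; the hostnames listed and the CPU counts are placeholders, not a finished configuration.

     # /var/spool/torque/server_priv/nodes  (on the pbs_server host)
     majestix np=8        # np = number of job slots (cores) this node offers
     rome np=8
     # each listed node also runs pbs_mom; check that they report in with:
     pbsnodes -a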

Week Ending March 28, 2017

Plan
  1. Check status of backup
  2. Investigate why backups are or aren't working, then find out how they work
  3. Assist with Torque installation
Task

3/22/2017

  1. Check backup status and chase down the wire running from the server room to the tech consultant office
  2. Attempt to clean up wires

3/23/2017

  1. Assist in the Torque installation
  2. Figured out that Majestix and Rome were supposed to be the Torque installations, not Majestix and Miraculix

3/24/2017

  1. Checking in

3/25/2017

  1. Checking in

3/26/2017

  1. Chasing down loose ends on the backup and how rsnapshot works

3/28/2017

  1. Need access to the tech consultant room to see the VM storing the backups and get an idea of when the last backup was
Results

3/22/2017

  1. Backups are currently being run using an rsnapshot.conf file, not a cron job (see the sketch below)
  2. Wires will need to be cleaned up some other time; we need a clearer picture of what's going where, and are thinking about labels
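
A minimal sketch of how rsnapshot is typically driven, assuming the tool on the backup VM is rsnapshot; the paths and retention values below are placeholders describing the usual setup, not the exact configuration we found.

     # /etc/rsnapshot.conf (excerpt) -- fields must be TAB-separated
     snapshot_root   /backup/snapshots/
     retain  daily   7
     backup  root@rome:/mnt/main/    rome/
     # rsnapshot itself does not schedule anything; a crontab entry normally triggers it:
     # 30 2 * * *  /usr/bin/rsnapshot daily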

3/23/2017

  1. Assisted with the Torque installation and removal on Majestix and Rome. Made it to step three of the installation, but we're packing it in for today.

3/24/2017

  1. Checking in

3/25/2017

  1. Checking in

3/26/2017

  1. Read the configuration file; I have a rough idea of what's going on with this backup

3/28/2017

  1. Need to sync up with Vitali; he had also mentioned I could SSH into the virtual environment. Will need to look into that so I can see what backups are being made, and what exactly is being backed up.
Concerns
  1. Backup status not 100% known
  2. We installed Torque on the wrong machines; per Dr. Jonas, Miraculix and Rome is the right configuration, not Miraculix and Majestix
  3. Wires need to be cleaned up in the future, with labels for future classes

Week Ending April 4, 2017

3/29/17

Task
  1. Check status of backups running on Rome
Results
    • Backups are running via rsnapshot, but we're redoing the switch, so we want to figure out the network connections before we mess with that too much
    • Checked out the server running the backups; it is running Ubuntu Server, version unknown
Plan
    • Track down where the ethernet cable goes
    • Get into room where backup server is located
Concerns
  1. Backups are supposed to be running, but unfortunately we won't be able to test them until we set up the new switch and get it running.

3/30/17

Task

Checking in

Results

Checking in

Plan

Checking in

Concerns

Checking in

3/31/17

Task

Checking in

Results

Checking in

Plan

Checking in

Concerns

Checking in

4/3/17

Task

Checking in

Results

Checking in

Plan

Checking in

Concerns

Checking in

Week Ending April 11, 2017

4/5/2017

Task
  1. Review physical setup of backups
    • Check physical connectivity
    • Make sure the IP addresses are correct
Results
  1. The backup server and Rome can see each other and are reachable via ping
  2. Losing a lot of packets, which is cause for concern; pings showed huge amounts of loss (see the sketch below)
  3. rsnapshot is installed and running, and appears to be configured correctly
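
A minimal sketch of the kind of checks used to measure the loss; the hostname and interface name are placeholders, and this is a general diagnostic recipe rather than the exact commands we ran.

     ping -c 100 rome        # the summary line reports the % packet loss
     ethtool eth0            # negotiated link speed/duplex on the backup server's NIC
     ip -s link show eth0    # RX/TX error and drop counters for the interface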
Plan
  1. Make sure the backups run smoothly once we reach out to IT and they troubleshoot the issues
  2. Reach out to UNHM-IT to see if they can assist us with troubleshooting the connection


Concerns
  1. High ping loss is causing instability
  2. Could lead to ineffective backups
  3. Tested the connection with an Ethernet cable tester
  4. Attempted to pull the switch out and connect directly into the board; it didn't help the packet loss
  5. Need to investigate further with assistance from UNHM-IT

4/7/2017

Task
  1. Checking in
Results
  1. Continuing to read documentation about rsnapshot and what the previous class did to set it up
Plan
  1. Continue to become familiar with backup software configuration
Concerns
  1. Packet loss is still an issue

4/10/2017

Task
  1. Looking into Torque on Rome and Miraculix
  2. Need to back up Caesar before the Tools group's G++ installation
  3. Research backup options and where to back it up to.
Results
  1. Realized that we need local user accounts on Rome and Miraculix; Torque jobs cannot be run as root
  2. Researching rsync to do a local backup of the environment; need to figure out which machine will hold the backup, since backups to the Ubuntu VM in the tech consultant room aren't functioning
  3. Reached out to Tools group to coordinate with them, need to know:
    • When they plan on doing it so we can ensure backups are in place
    • Ensure smooth backup of Caesar
Plan
  1. Coordinate with Tools group to assist in installation of G++
  2. Further troubleshoot packet loss issues with the network between Rome and Backup server


Concerns
  1. Packet loss is extreme; reached out to Jonas
  2. Need to back up Caesar in case the G++ installation fails

4/11/2017

Task
  1. Reach out to tools
  2. Investigate WiFi dongle issues
Results
  1. The WiFi dongle has issues connecting to UNH-Secure using the CA and user certificates, which would let it stay on UNH-Secure permanently instead of being kicked to the login page every 30 minutes on UNH-Public
  2. Tools is now aware we are going to conduct a backup on Wednesday, after which they can continue with their installation of G++


Plan
  1. Andrew and I are going to investigate the packet loss issues more in depth
  2. UNHM-IT has responded that they submitted a ticket; however, we are still going to test so we can get a better idea of what the issue may be. It may be a routing issue; unsure
    • Wire diagnostics we ran last week showed that the wires had a solid connection
Concerns
  1. Backing up Caesar
  2. Packet loss issues need to be resolved so Rome can continue to back up accordingly

Week Ending April 18, 2017

4/12/17
Task
  1. Run tests on both ends of the backup network between Rome and the tech consultant room
    • Troubleshoot using laptops: try to hit a machine with a laptop on one side, then the other way around, to see if the problem is the servers or the connection itself
  2. Look into pbs_sched with Torque on Rome and Majestix
Results
  1. Was able to ping Rome from the tech consultant room using a laptop with no issue
    • Upon further investigation, the lights on the server's network adapter are FUBAR: a red light with a blinking amber light, indicating connectivity problems
    • There is an add-in network card in the server; we may try connecting to that to see if the packet loss improves
  2. Attempted to install Maui on Rome as the scheduler, instead of pbs_sched


Plan
  1. Cancelled the UNHM-IT ticket, since the packet loss is NOT a wiring issue but a server issue
  2. Still working on getting pbs_sched, or the Maui scheduler, running with Torque
  3. Investigate whether a reboot will resolve the issue; need to speak with Vitali about this
  4. If a reboot doesn't resolve it, we must look into using the network card that is readily available in the machine; however, this card doesn't show up in the Hyper-V VM we are using on the Windows Server machine.
Concerns
  1. Server network card not being detected
  2. pbs_sched not working; documentation is limited and poor
  3. Network cabling is running fine; the issue is with the server where backups are being STORED, not with Rome
4/13/17
Task
  1. Still looking into the backup of Caesar before we move on: where we're going to put it, etc.
Results
  1. Going to use rsync
    • We need a backup in the off chance the G++ installation gets botched or changes files on Caesar; we need an image to restore from so we can proceed as normal
Plan
  1. Going to back up using rsync (see the sketch after this list)
    • Will ensure the backup image can be restored in case of failure
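
A minimal sketch of the kind of rsync command this plan implies, pushing the root filesystem from Caesar over SSH; the destination host directory (/backups/caesar/) and the exclude list are assumptions, not the final choices.

     # run on caesar; -aAX preserves permissions, ACLs, and extended attributes
     rsync -aAX --numeric-ids \
           --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run --exclude=/tmp --exclude=/mnt \
           / root@rome:/backups/caesar/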
Concerns
  1. What information needs to be backed up: the root directory, the entire system?
  2. We need to turn Brutus back on, because I'm fairly certain that's the machine we're going to back up to
4/17/17
Task
  1. Investigate Torque some more
  2. Check to see if Brutus is up; I would have turned it on after class today, but class was cancelled and I wasn't near the school
    • Brutus needs to be up to back up Caesar to; the sooner this gets done, the sooner they can install G++


Results
  1. Gained somewhat more familiarity with scheduling, but need to work on it more
  2. Brutus isn't up; cannot do the backup


Plan
  1. Turn on Brutus after class tomorrow and back up using rsync
  2. Assist Tools with the G++ installation after Caesar is backed up
Concerns
  1. Get the backup done so the Tools group can proceed
4/18/17
Task
  1. Worked with Andrew to back up / directory to Rome.
    • Directories with important information have been copied over to Rome
    • Should be able to move them back if a restore is needed
  2. Working with the Torque scheduler to test whether it runs a train faster or slower than a single machine
Results
  1. Backup of / directory has now been stored on Rome
  2. Brutus is no longer needed, since we're not backing up to Brutus
  3. Need to run speed comparisons between a Torque train and a non-Torque train


Plan
  1. Create a baseline comparing Torque running a train versus a single server running a train (see the sketch after this list)
  2. Tools should be able to install G++ now that the backup is on Rome
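
A minimal sketch of how that baseline could be timed, assuming the train is kicked off by a script (run_train.sh below is a hypothetical wrapper, not the project's actual script) and submitted through Torque with qsub; the resource request is illustrative only.

     # train.pbs -- a minimal Torque job script
     #PBS -N train_baseline
     #PBS -l nodes=1:ppn=8,walltime=12:00:00
     cd $PBS_O_WORKDIR
     time ./run_train.sh           # hypothetical wrapper around the Sphinx train

     # compare against running the same command directly on one machine:
     #   time ./run_train.sh       (single-machine baseline)
     #   qsub train.pbs && qstat   (Torque run; compare the two wall-clock times)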
Concerns
  1. Need to figure out how to use Torque and ensure that both machines are using their resources towards the same project or train
  2. Caesar is backed up

Week Ending April 25, 2017

4/19/17
Task
  1. URC poster presentation
Results
  1. Looked into the tech consultant room and was able to mess with the VM
    • VM probably needs to be rebuilt, as it's not physically seeing the network adapter
  2. Backups will be unable to resume until this network connection is fixed
Plan
  1. Present URC poster
  2. Ensure that backup server is brought up in a timely fashion
Concerns
  1. Backups still down
4/20/17
Task
  1. Attempted to come into school to reconfigure the VM on the backup server, but no one was present
  2. Checking in
Results
  1. Checking in
Plan
  1. Checking in
Concerns
  1. Checking in
4/21/17
Task
  1. Continue to troubleshoot server issues
  2. Figure out Maui and how to schedule sphinx jobs for trains and decodes
Results
  1. Read more documentation on Maui, messed around with it a little bit, but need to look into it more
  2. The server is not communicating; rebooted with no success, so it's going to need a rebuild


Plan
  1. Rebuild the server when possible, probably on class day
    • Once server is rebuilt, backup /mnt/main immediately
    • Once backup is complete, tools can proceed
Concerns
  1. Backups still not working
4/25/17
Task
  1. Checking in
Results
  1. Checking in
Plan
  1. Checking in
Concerns
  1. Checking in

Week Ending May 2, 2017

Task


Results


Plan


Concerns


Week Ending May 9, 2017

Task


Results


Plan


Concerns