Sunday, October 13, 2013

Network Attached Storage (NAS) Comes Home

These days it is not uncommon for home networks to rival the business networks of 10 years ago. Rather than having one family computer that everyone shares, people may have more than one desktop, a laptop or two, tablets, and smartphones. This can make finding files hard: they are saved in local folders by default, so if you created, say, an address list for mail merges on your laptop but then need to refer to it later while on your desktop, it is not available. How can you make this work better?

You could set up a machine to serve as a file server (using something like a Windows HomeGroup, or more basic SMB shares) and use it as a repository for your files. This works well, but it can be complicated to set up and maintain, and the files may not be accessible to all your devices. Using a cloud-based service such as SkyDrive also works, but there is still an accessibility question, and you can lose access to your files if your Internet connection goes down. A better solution may be picking up a Network Attached Storage (NAS) device.

NAS Devices

A NAS device is basically an external hard drive that plugs into your network rather than your computer. It has a small OS and connects directly to the router or main network switch. NAS devices have become common on corporate networks over the last several years because they are an inexpensive way to add storage, and now they are starting to enter the home market. Home NAS devices typically come with a set of generic shared folders and allow users to create their own folders (either private or shared). Hard drive space varies, but capacities typically range from 1 to 4 TB.


Home users typically use NAS devices for three main things: file sharing, centralized backup, and media streaming. A NAS may also allow for remote access to data.

File sharing

By providing a large amount of storage space available to all computers on the network, NAS devices make a great file sharing solution. Directories on the NAS can be mapped to computers as a network drive, allowing immediate access to files stored on the device.
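Once a NAS folder is mapped as a network drive (or mounted on Linux), programs treat it like any local folder, so the same file is reachable from every machine. Here's a minimal Python sketch of the idea - the share root and folder names below are stand-ins of my own, not anything from a real NAS:

```python
import tempfile
from pathlib import Path

# Stand-in for a mapped share -- in real use this might be
# Path("Z:/") on Windows or Path("/mnt/nas") on Linux.
SHARE_ROOT = Path(tempfile.gettempdir()) / "nas-demo"

def save_note(name: str, text: str) -> Path:
    """Write a text file into a shared Documents folder on the 'NAS'."""
    docs = SHARE_ROOT / "Documents"
    docs.mkdir(parents=True, exist_ok=True)
    target = docs / name
    target.write_text(text, encoding="utf-8")
    return target

def load_note(name: str) -> str:
    """Read the file back -- any machine mapping the same share
    sees the same path and the same contents."""
    return (SHARE_ROOT / "Documents" / name).read_text(encoding="utf-8")
```

Save the mail-merge address list once, and every device that maps the share can open it at the same path.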

Centralized backup

Rather than having separate backup storage (such as USB external drives) for each computer on the network, or moving one drive between computers and tying it up while the systems back up, users can set their backup software to use the NAS as the destination.
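In practice, "point your backup software at the NAS" boils down to a copy-if-changed loop against a network path. A rough Python sketch of that loop (the source and destination paths are placeholders; real backup software adds compression, scheduling, and versioning on top):

```python
import shutil
from pathlib import Path

def backup_to_nas(source: Path, dest: Path) -> list[Path]:
    """Copy files from source into dest (e.g. a mapped NAS share),
    skipping any file whose backup copy is already up to date."""
    copied = []
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        target = dest / src.relative_to(source)
        # Skip files whose backup is at least as new as the original.
        if target.exists() and target.stat().st_mtime >= src.stat().st_mtime:
            continue
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, target)  # copy2 preserves timestamps
        copied.append(target)
    return copied
```

Because copy2 carries the timestamps over, a second run with no changes copies nothing - which is what makes an unattended nightly run cheap.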

Media streaming

Most consumer-grade NAS devices incorporate Digital Living Network Alliance (DLNA) servers that allow them to work with equipment such as game consoles, smart TVs, DVD/Blu-Ray players, and other home entertainment appliances. This allows users to keep all their photos, music, and video files in a central location that can be accessed from multiple devices at the same time, rather than either needing to make copies of the files on each device or physically move an external drive between machines.

Remote Access

Most NAS devices also allow remote access to data. They either use an outside service to provide a web interface, or have apps for tablets and smartphones that can access the device from any Internet connection. In some cases the apps would allow media to stream remotely as well, but the quality would be dependent on the Internet connection on both ends. Depending on the remote access solution, you may also be able to share files with friends or upload files from remote locations so they are available at home.


A NAS device is a great addition to some home networks, especially if there are multiple devices on the network or a large number of media files to share across more than one device. Their device-agnostic nature means that users have access to their media and/or data from any network-aware device with minimal issues. For example, my home network has 3 desktops, 1 web server, 5 laptops (3 in regular use), 2 tablets, 2 DLNA-aware game consoles, and 2 smartphones. We can get to our files and stream media from multiple rooms simultaneously, which makes the NAS a perfect addition to our computer ecosystem.

Saturday, April 13, 2013

OEMs Aren't Always Right...

OK, the title is obvious. Who hasn't grumbled at the default system settings on a new computer, or immediately uninstalled all the clutter placed on a system by the manufacturer? To me, it's another argument for building (or at least designing) your own systems. But I digress... on to the post.

The dumbest thing I have done in a while is trust that the company that sold a server knew what was best when they set it up. By doing so, I decided to not make any changes to the overall system configuration before passing the "point of no return," where I would have to reinstall the entire system to change things.

I was helping a friend set up a server for his small business. Initially, I was thinking about getting a Windows 2008 server and setting it up as a Domain Controller and file server. When we were looking at systems, Microcenter had a server with SBS 2011 and decent specs for less than their basic Windows 2008 server system, so we decided to go with that. I won't get into the relative merits of SBS vs. other server options here, other than to say that I think SBS can be a decent choice for a small business, depending on their needs. But SBS isn't the problem. The problem is that the people at Microcenter seemed to feel that RAID 5 was the best configuration for the hard drives. After using the system for the better part of a year, I beg to differ.

As per Wikipedia (and everywhere else, but that is where I grabbed the short definition from), RAID, or Redundant Array of Independent Disks, is a storage technology that combines multiple disk drive components into a logical unit. RAID has several different configurations, designated by numbers. At the most basic level (RAID 0), it means that you can have, say, two 500 GB disks and set them up so the system sees a single 1 TB disk. RAID 1 would take those two disks and provide fault tolerance by setting them up so the system sees a single 500 GB disk, with the data mirrored on both drives; if one fails, the system keeps operating and nothing is lost. RAID 5 requires at least 3 drives, and allows the system to see 2/3 of the space (or more, depending on the number of drives) as a single entity while providing fault tolerance in the event of a single drive failure.
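The capacity arithmetic above can be sketched in a few lines of Python (my own simplification, assuming equal-sized drives):

```python
def usable_capacity(level: int, drives: list[int]) -> int:
    """Usable space (same units as the inputs) for an array of
    equal-sized drives at RAID levels 0, 1, and 5."""
    if len(set(drives)) != 1:
        raise ValueError("this sketch assumes equal-sized drives")
    size, n = drives[0], len(drives)
    if level == 0:             # striping: all the space, no redundancy
        return size * n
    if level == 1 and n == 2:  # mirroring: one drive's worth
        return size
    if level == 5 and n >= 3:  # one drive's worth of space goes to parity
        return size * (n - 1)
    raise ValueError("unsupported level/drive-count combination")
```

So two 500 GB drives give 1000 GB at RAID 0 or 500 GB at RAID 1, and three give 1000 GB at RAID 5 - the 2/3 mentioned above.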

While RAID 5 sounds great on paper, in practice it can be painfully slow. This is because writes can require several extra operations to calculate the parity bits used for fault tolerance. For small writes, this can generate a significant amount of disk activity, reducing system performance. In some instances this is fine, but with SBS the lag can be really noticeable - especially when trying to perform maintenance that involves a lot of writes to the disk.
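The parity that slows RAID 5 down is typically just an XOR across the blocks in a stripe. A small write can't simply write the new data: the controller reads the old data and the old parity, XORs the old data out and the new data in, then writes both - roughly four I/Os for one logical write. A toy illustration of the math (real controllers do this in firmware across full stripes):

```python
def parity(blocks: list[bytes]) -> bytes:
    """XOR parity over a stripe of equal-length blocks."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

def update_parity(old_parity: bytes, old_data: bytes, new_data: bytes) -> bytes:
    """Small-write path: XOR the old data out of the parity and the
    new data in -- this is why one small write costs extra reads."""
    return parity([old_parity, old_data, new_data])

def reconstruct(survivors: list[bytes], parity_block: bytes) -> bytes:
    """Rebuild the one missing block after a drive failure -- XOR is
    its own inverse, so the same operation recovers the data."""
    return parity(survivors + [parity_block])
```

The same XOR that makes single-drive recovery possible is exactly what turns every small write into a read-modify-write cycle.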

So now I am stuck as the part-time IT guy for a company that has a server that performs OK for the mundane day-to-day stuff (DNS server, file server, etc.), but takes forever when I actually need to do maintenance. I should have done what I briefly thought about doing during setup - pull one drive to keep as a replacement in case of failure, and set the other two up in a RAID 1 configuration. Instead I believed that the people at Microcenter knew what they were doing, and that RAID 5 was best. And I can't change it unless I want to rebuild the system or figure out the best way to clone, shrink, and restore partitions. Since the system works fine day-to-day, I have decided to grin and bear it, with each maintenance period serving as a reminder to do my research and follow my instincts going forward.

Do you have any experiences where you trusted the OEM when you shouldn't have? Let me know in the comments.

Wednesday, April 3, 2013

The Most Important Thing You Probably Don't Do

First of all, I want to say sorry for the big break between posts... I've been busy with work and haven't had time to write. Now work has slowed down a little bit, and I'm feeling relaxed and ready for some blogging. And what better topic to write about than something that anyone who uses a computer should do, but way too many people ignore. Of course, I am talking about backing up your data.

I was recently reminded of how important backing up is when a friend contacted me about PC issues. He had a laptop that, after running trouble-free for about 2 years, suddenly just stopped working. As in, it wouldn't even POST (Power On Self Test) or make it to the manufacturer splash screen. He brought it over and I took a look. When you hit the power button, the HDD would make a slight sound (like it was trying to spin up) and then stop. Other than that, nothing at all happened. I asked if he had a backup, because it looked like he would have to send the system in for repairs (and/or his HDD had potentially failed). He hadn't. He mentioned that he had a lot of important data and hoped it wasn't lost. Luckily for him, the HDD turned out to be good, and I was able to pull the drive from the dead machine and make a copy of his data before he took it in for servicing. He ended up getting a new PC (the problem was too expensive to fix compared to the cost of a new machine), and he now regularly backs up his important data.


Backup methods

There are several schemes you can use to prevent data loss. They range from simply copying important files to a second location (such as an external drive or cloud-based storage) to installing a dedicated backup solution that uses media such as tapes (or the cloud) to back up your entire hard drive (or specific files). Each method has its pros and cons, and the best choice depends on a few factors, such as how much data you have to back up, how critical the data is, and how much you want to spend. Here are descriptions of a few of them, in no particular order:

Keeping your important files in the cloud. This isn't a backup method per se, but it does serve to protect important files. It simply involves using a cloud-based service such as SkyDrive or UbuntuOne as your main document repository, accessing and saving files to the service rather than your local hard drive. Many cloud services have a desktop application that will automatically sync the cloud data to local folders to allow offline access, so it technically does keep a backup copy (however, the files sync whenever you are connected, so restoring old versions of files is not necessarily possible without some additional work).

Copying Files. This entails physically copying your important files to a different PC, external drive, CD/DVD, or cloud-based storage on a regular basis. This is a labor-intensive way to back up data, as you need to decide what files to copy, actively copy them, potentially manage media, etc. It can work as a quick way to ensure important data doesn't get lost, because you will always have a copy somewhere, but most of the other backup methods are simpler to use (once they are set up) and more robust.

Using local backup software. This can cover a large number of options, from built-in OS solutions to enterprise-level software packages costing several thousand dollars. Most software backup solutions have several things in common, including compression to save space on your backup media, the ability to select what you want to back up and/or exclude, backup scheduling, and support for different types of backup (full, incremental, differential, copy, and daily). Generally, backups are made to some kind of external media such as portable hard drives, DVDs, a network location, or tape drives. This is one of the most common methods of backing up a system and, once set up, can work with minimal user intervention if scheduling is turned on. Some software provides partition backups - copying an entire drive or partition rather than individual files (sometimes called a mirror image) - which can be convenient for restoring systems that crash (however, it may be harder to restore individual files). Costs range from free to thousands of dollars for the software, and there may also be hardware/media costs (external drives, tapes, DVDs, etc.). I personally use backup software (EASEUS Todo Backup Free - I'm planning on writing a review of it soon), and find it works very well for me.
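The full/incremental/differential distinction comes down to which files each run selects. A simplified sketch based on modification times (my own illustration - real packages usually track archive bits or backup catalogs instead):

```python
from pathlib import Path

def select_files(kind: str, root: Path,
                 last_full: float, last_backup: float) -> list[Path]:
    """Files a given backup type would copy, judged by mtime:
    'full' takes everything, 'differential' takes changes since the
    last full backup, 'incremental' takes changes since the last
    backup of any kind."""
    files = [p for p in sorted(root.rglob("*")) if p.is_file()]
    if kind == "full":
        return files
    cutoff = last_full if kind == "differential" else last_backup
    return [p for p in files if p.stat().st_mtime > cutoff]
```

The trade-off follows directly: incrementals are the smallest but a restore needs the whole chain, while a differential only needs itself plus the last full backup.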

Cloud-based backup services. The past few years have seen the introduction of reasonably-priced cloud-based solutions for backing up files. These are basically a combination of dedicated backup software and cloud-based storage. Users purchase a plan that includes cloud-based storage and the software needed to perform backups and restores. The software automatically runs backups according to a pre-set schedule (in some cases monitoring Internet usage and uploading during down times). Files can be restored to a computer from any Internet connection, which is very convenient for laptop users. In some cases, the space on basic plans may be inadequate, and extra space can drive the costs up, but there are some companies that offer a per-computer home plan.


No matter what method you use for backups, you need to have a schedule (unless you store and access all your important files in the cloud). If you don't keep to a regular schedule, you can still lose a lot of data even if you have a backup, depending on how often you add or change files. I back up my main systems at least once a month as part of my regular maintenance schedule - in general, a month wouldn't be too big of a loss for me. My office backs up daily. You may only need to back up every couple of months. But, again, the important thing is making a schedule and keeping to it.

The best way to adhere to your schedule is to use backup software or a cloud-based service that allows scheduling. Once you select your backup files, you can go ahead and "set-it-and-forget-it", letting your computer handle backing up. This allows you to get backups with minimal work - for local backups you may need to occasionally change media, but in general you don't need to be too aware of the process (this assumes a home system - business backups are a whole different world). If you simply copy files, or use software that doesn't allow scheduling, you'll need to be more involved, actually sitting down and making the copies or starting the backup (unless you're good at writing batch files, that is). Even if you don't set an automatic schedule, you should mark days to do your backups in your calendar to ensure that you have regular copies of your data. I choose to run my backups manually because I feel it gives me more control of storage space, but I make sure that I do it monthly.


However you choose to do it, backing up your data is a very important task. How, and how often, you do it is up to you and the type/amount of data on your system, but remember that if you don't back up, you're taking a chance. You may not even know what is vulnerable until your system crashes and you lose important files. Feel free to discuss further in the comments.

Wednesday, February 27, 2013

Building a Box - Dual-Boot

While I am generally a Windows user, I have always been interested in Linux. I made an attempt to install Red Hat on my PC in the late 90s (but I had too many driver issues and couldn't get X Windows working properly, so I decided to wait a while). In early 2010 I looked into it again and discovered Ubuntu. After playing with 10.04 and subsequent versions in VirtualBox for a while, I restored an old laptop using 11.04. After that, I was hooked, and I have had some version of Ubuntu running on a PC ever since. Once I was sure everything was stable on my new machine (I really have to give it a name...), I grabbed the latest version (12.10) and set up a dual-boot machine.

It actually wasn't that quick. I definitely wanted to run Ubuntu on the system, but I also didn't necessarily want to extend the boot time - I was too used to the 20 seconds from completely off to working in a desktop program, and I didn't want to slow things down, even if it was only for the 10 seconds GRUB gives me to select an OS. So, at first, I set up a virtual machine in Hyper-V, Microsoft's virtualization software that comes with Windows 8 Professional. Due to two issues - one where SpeedStep seemed to stop working (thus locking my PC to slightly higher than stock speeds - more on that in another post), and one where Ubuntu couldn't easily access the Internet - I deleted that VM and went with VirtualBox, Oracle's free virtualization software. The VirtualBox installation, being a third-party solution not as deeply tied to the system, seemed to work better; however, Ubuntu performance was slow, and Compiz kept crashing. I finally decided to go ahead and do what I should have done the first time, and installed Ubuntu in a dual-boot configuration. I know some of you will argue I should have installed Ubuntu as the only OS right when I built the machine, but, in addition to needing Windows for work, I am a Windows user, and I actually like the way it works most of the time.


Since Ubuntu has a much smaller footprint than Windows, and since I am a light user, I made a 100 GB partition for the installation on my 2 TB Seagate hard drive. I had originally started with 50 GB, but that didn't even register in Explorer, so I decided it was fine to take the full 100. Once I had the partition in place, I put the LiveDVD in the drive and restarted... and got dropped back into Windows. Since my motherboard has a UEFI BIOS and was set to Fast Boot, the default was to boot into Windows without even checking whether there was a disc in the drive. I went into the BIOS, set the DVD as the main boot drive, and was able to get into the Ubuntu DVD.

One thing I like about Ubuntu is how fast it installs. When I set up my last PC to dual boot, it took about 15 minutes, and this time was no different. After answering the basic questions (the type of setup, the install drive, my location and language, etc.), it copied the files and rebooted... into Windows. No GRUB. It was again an issue with the UEFI BIOS, as well as the GRUB bootloader. The Ubuntu site had some information on how to fix this (1. turn off Fast Boot; 2. run Boot Repair from the LiveDVD), and once that was done, I rebooted and was greeted with the GRUB screen. I booted into Ubuntu and logged in, and everything worked fine. After the installation, I spent about an hour tweaking the OS, installing my favorite software packages, and getting the latest updates.


Ubuntu is a fast OS. Blazingly fast. I know a lot of this is due to it being Linux, but the Ubuntu Unity shell doesn't really slow things down. And on my new hardware, it runs very fast. At times, opening programs seems like I am just restoring minimized windows, rather than starting a program. Graphics performance is fine for my use as well. It works well with all my hardware, so I am not seeing any driver issues or having problems with graphics, sound, webcam, or WiFi.

Next Steps

My plans for the Ubuntu system are to set up a LAMP (Linux, Apache, MySQL, PHP) server and get a Drupal install running for testing, as some of our clients at work are starting to request their sites on that platform. I am also going to set up a MediaWiki installation - I manage a wiki at the office, and I want a place where I can try out new extensions, tweaks, and other things that look cool without messing up the official development environment. Other than that, I am going to keep using Ubuntu and getting used to it so I can keep my computing options open.


Sunday, February 17, 2013

Building a Box - Upgrading

In the short time since I finished building my custom PC, I have done a couple of upgrades to the components and played around with overclocking in order to improve overall system performance. In this post, I will cover the new components I have added, and I will have a separate post on overclocking in the near future.

New Components

Since the build, I have added an aftermarket cooling system to improve cooling when overclocking (see Overclocking Without Permission for details on why it was necessary), added a Blu-ray RW drive, and increased cooling capacity even more by putting another fan in the case. The end result is that I can now play and back up Blu-ray movies, and I can burn my HD home movies in actual high definition without worrying that my computer will overheat when encoding the videos (that wasn't an issue when the system was running at stock speeds).

Blu-Ray Drive  

The Blu-ray drive I added is an LG WH14NS40, a SATA 14x Blu-ray burner that supports BDXL, M-Disc, and 3D playback. The model I purchased is an OEM version that comes without software, but I had already purchased Nero 12, which has an upgrade available for playing and writing Blu-ray. Installation was straightforward - I simply removed the Sony DVD-RW drive and plugged this one in.

Drive performance is good - in addition to the 14x BD, it writes DVDs at 16x, and CDs at 48x. I haven't had any issues reading discs, and burning all types of discs has been successful. I've finally been able to watch my wedding on my entertainment center TV in full HD, rather than DVD quality. Amazon usually has this drive on sale, so if you are in the market for a BD-RW, I would recommend you check it out.

Aftermarket Cooler

The Antec Khuler H2O 620 installed
Due to the inadvertent overclocking issue I had, I realized that if I wanted to overclock the computer, I would definitely need to upgrade from the stock fan provided by Intel. When I purchased the components, I had always planned on doing some kind of overclocking, which is why I bought the i7 3770k as opposed to the regular 3770 - the unlocked multiplier makes it easy to overclock. I did some research, and eventually ended up getting the Antec Khuler H2O 620, a liquid cooling solution. I bought it because the cooling performance looked decent in the various reviews I read, the price was good, and it doesn't take up a lot of space in my case. Because it comes with a fan on the radiator, I added the case fan that was already there to the radiator in a push-pull configuration (only because I had an extra fan - I'm pretty sure there isn't much of a cooling performance gain).

It is performing very well for me - I actually liked it enough to write a review, which I'll put here to save some writing...

I purchased this cooler for my Intel i7 3770k processor on an ASUS P8Z77 M-Pro motherboard. The processor was running a bit hot under load with the stock cooler at stock speeds - sustained full-load usage pushed the temps to 80+ degrees Celsius. After installation, sustained full-load temps (3 hours of Prime95) while mildly overclocked (4.2 GHz) are stable at between 50 and 60C. Idle temps are in the low to mid 20s (C).

The unit is very quiet in my setup - under load the system is a bit quieter than with the stock heat sink and fan. I have not experienced any of the clicking that has been reported; however, my unit only has a 3-pin connector, so I assume the pump is constantly running at full speed. It also fits well in my case - a CoolerMaster Storm Enforcer mid-tower. The stock install doesn't block anything important on the motherboard, but if you decide to run a push/pull fan configuration, you may cover something.

As many others have reported, the instructions leave a lot to be desired - I would have given this 5 stars if the instructions were better. I was able to get through them without an issue, but it took several looks and a couple of re-dos for the screw adapters. I definitely recommend looking for some YouTube videos or other (non-Antec) online guides before you install the unit. One thing to note - for my Socket 1155 motherboard, I used the holes for 1156, and everything lined up fine. I would also recommend using a magnifying glass during the install - the mounting bracket for the back of the motherboard has markings indicating where the barrel nuts go for each board type.

I had also looked at some air cooling, but my main concern was the amount of space it would use in the case. As you can see from the picture above, I have plenty of space available with the Khuler 620 in place.

Future Plans

The PC is working well now, but there are still a couple of things I would like to eventually do before I officially call it "finished." As I have mentioned before, I want to get a discrete video card, rather than using the on-chip Intel graphics. Since I am probably going to go for something in the high-end bracket (and since the system performs adequately enough for my current use with the Intel graphics) I am taking my time in selecting a card. I am also considering getting a sound card, however this may be dependent on the video card I get because the ASUS P8Z77 M-Pro motherboard is microATX and has a limited number of expansion slots, so I may not have room for it.

If anyone has thoughts on video cards, feel free to leave me a comment.

Wednesday, February 13, 2013

My Favorite iPad apps

I've been using an iPad since late 2010. In that time, I have come across several apps that I really enjoy using, or find very useful. Here is a list of my 10 favorite iPad apps.

  1. The Simpsons: Tapped Out. I've been a Simpsons fan since they were a short on The Tracey Ullman Show, so this one feels like it was written for me. The premise of the game is that Homer is playing a similar game on his myPad while at work and, by ignoring his duties, accidentally destroys Springfield. You have to put it back together, adding buildings and characters by earning money and doughnuts (mmmmm, premium currency) by assigning the characters tasks. The game also has quests that allow you to add new characters and buildings. While the gameplay is simple - give characters tasks and build things - the content is amusing and brings in several themes from the show. They have also had several events where you can get special items for a limited time, including Halloween, Thanksgiving, Christmas, and Valentine's Day. Like other similar games, you can purchase additional "premium currency" for real money to either speed things along or buy special items. That is actually the main downside - if you're not careful, you can spend a lot of money.
  2. Jurassic Park Builder. Very similar to The Simpsons: Tapped Out, in this game you are in charge of building Jurassic Park. The game features lots of different dinosaurs, including the T-Rex, Stegosaurus, Triceratops, Troodon, Brontosaurus (yes, even though it's not real), and others. They have also recently added an undersea "park" with ancient sea creatures. In addition to setting up the park, characters from the first two movies (they include Ian Malcolm's daughter in the game) give you tasks to complete to earn coins and cash (this game's premium currency).
  3. Game for Cats. The name says it all. A "game" that has a mouse running around (or a butterfly, or a laser pointer) on the screen so a cat can hit it. It even keeps score. While I like the idea and execution of this app a lot, my cat just sits and watches the motion most of the time... he won't try and hit the screen.
  4. MyFitnessPal. The iPad version of the Web site, MyFitnessPal allows you to track food and exercise on your iPad. I actually like the way it works better than the Web site - adding food shows the calories right on the add screen, as opposed to the online interface. The graphs and charts are also easier to access and view.
  5. Sweet and Spicy. An app for all lovers of Indian food, Sweet and Spicy has thousands of recipes. There are plenty of search and filtering options too, including regional, vegetarian, non-veg, festival, 30-minute, and several others. The recipes are user-submitted, and they all have comments and ratings from other users. Many recipes have videos as well. The app allows you to save your favorites for quick access. You can also enter the ingredients you have on hand and see what you can make with them.
  6. Desktop Connect. As a general "tech guy" and a system administrator on more than one server, I find that I need to remotely connect to the servers often enough to warrant having a remote desktop app on all my mobile devices. When I first started using the iPad, Desktop Connect was one of the few choices. I purchased it, and I do not regret that decision. Not only can you save connections for frequently used servers, it will also scan the local network and show you any machines with remote connections available. It supports both VNC and RDP, so it can connect to pretty much everything. It has an "easy connect" feature that lets you use a Gmail login to connect via VNC, but since it ends up installing a VNC server, I just used the direct VNC connection and haven't really tried the Gmail method.
  7. Nook. I have a lot of Nook-based e-books, and the iPad client lets me do everything I need. It syncs with other devices and remembers where you left off. It also has a dictionary and lets you highlight and set bookmarks.
  8. Angry Birds. Actually, this is three apps... Original, Space, and Star Wars. It's a great time waster, and the different puzzles they set up are great (though sometimes frustrating...). My only wish is that the games could communicate across platforms so if I get some levels done using my phone, it would transfer to my iPad.
  9. Chrome. Until I got my Surface RT, Chrome was the default browser on all my devices. It syncs pages and favorites across platforms and devices, which makes managing bookmarks and history very easy. It works as well as Safari on the iPad, and I feel it is a bit more elegant (although I can't exactly explain why...).
  10. Saavn. This is an app from a Web site of the same name that plays Indian music. There are thousands of songs available for streaming, and there are also tons of playlists (both official and user-created). Saavn also has a "surprise me" feature that puts together a random playlist of 15 songs.
While most of these aren't required or "can't live without" apps, they definitely make my iPad a much more useful device for both productivity and fun. I wouldn't hesitate to recommend any of the apps above to anyone with an iPad. Feel free to leave me a comment if you have a "to die for" app that you think I'd like. Note that I have a first gen iPad, so photo/video shooting apps won't really work for me...

Saturday, February 9, 2013

New Ecosystem

With the replacement of my once-great Samsung Stratosphere* with an HTC Windows Phone X, my business computing is now entirely under the Microsoft umbrella. Windows 7 at the office, Windows 8 at home (desktop and laptop), Surface RT for the tablet/netbook. Now if Redmond would only send me the shill checks they promised...

But seriously, this is the first time that I have had all my eggs in one basket - I would generally have one "odd man out" device, if not more. In fact, until recently, my standard trio for business trips was my iPad, my Windows 7/Ubuntu dual-boot netbook, and my Stratosphere. I'll be doing some posts on living in the MS world (like I haven't already...) and how it affects my work, leisure**, and other computing activities.

*more to come on this story - quick version is that Verizon took a great phone and completely ruined it with an OS update.

** My iPad needn't worry... there are enough iPad-only apps that I got used to using on that platform that I am not giving it up completely (although if they release The Simpsons: Tapped Out on Windows 8, it could change things...)

Wednesday, February 6, 2013

Overclocking Without Permission

Sorry for the long delay between posts this month. I really want to get to posting a couple of times a week, but Real Life keeps popping up and using my time. In any event, I am going to try to get back to that frequency starting now. I have a lot of ideas for posts, and I'm going to start actually typing them up. I'll start off with something I promised a while ago.

As I mentioned in the post about my custom PC build, I had an issue with the system crashing randomly. It only crashed twice - once when I was running the Windows Experience Index, and once overnight. This worried me because it was a brand new system, and everything should have been in tip-top shape - unless I screwed something up (or received a faulty piece of equipment). I was really hoping I wouldn't have to redo all or part of the build. Luckily, I enjoy troubleshooting almost as much as I like building PCs, so I put on my thinking cap (do they still make those?) and started working.

When I checked the event logs for the second crash, I saw that the system crashed at 3:02 AM. Since I wasn't actively using the system then, it really puzzled me... while the system can do some automatic tasks overnight, what could it do that makes the system crash? The event logs also indicated that it was a hardware issue, but didn't specify which hardware caused the crash. Since I had to work, I put the minidump files onto a flash drive and left for the office.

While I was at work, I went to the ASUS site and downloaded all the updates available for my motherboard, as well as a few utilities. I also reviewed the minidump files, but they didn't really give me much information - it looked like a different module crashed each time. This made it look even more like a hardware problem to me, but I still wasn't sure.

When I got home, I installed the updates as well as some of the additional ASUS utilities. I also installed WhoCrashed, a utility that reads minidump files and translates their contents into plain English. Analyzing the logs in WhoCrashed didn't give me any additional information as to the root cause, but that was due to the nature of the crashes. Since the crashes seemed like they were probably hardware related, I decided to run some tests.

The ASUS utilities included ASUS PC Diagnostics, a program that includes a "stress test" for the CPU, system memory, and the video system. I decided to start with a CPU test. The CPU test forces the processor to calculate pi and uses 100% of all available cores. It runs for as long as you want, but defaults to 1 minute. I ran the one-minute test, and the system only made it through 45 seconds before crashing. While it looked like the CPU was the issue, I ran the memory and video tests to confirm. Both tests ran without any issues. It seemed I had an issue with the CPU.
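For the curious, a stress test like this is conceptually very simple: it just pins every core with pure computation for a fixed time. Here is a rough Python sketch of the idea - my own toy version, not ASUS's code - approximating pi with a Leibniz series on every core:

```python
import multiprocessing as mp
import time

def burn_cpu(seconds):
    """Busy-loop for `seconds`, computing a Leibniz-series approximation of pi."""
    deadline = time.time() + seconds
    pi, sign, k = 0.0, 1.0, 0
    while time.time() < deadline:
        pi += sign * 4.0 / (2 * k + 1)
        sign, k = -sign, k + 1
    return pi

def stress_all_cores(seconds=60):
    """Run burn_cpu on every logical core at once, like the ASUS CPU test."""
    with mp.Pool(mp.cpu_count()) as pool:
        return pool.map(burn_cpu, [seconds] * mp.cpu_count())

if __name__ == "__main__":
    results = stress_all_cores(seconds=2)
    print(f"Ran on {len(results)} cores; pi approx {results[0]:.5f}")
```

A real diagnostic would also watch for thermal throttling and errors, but the load pattern - 100% utilization on all cores until the timer expires - is the same.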

While I wasn't sure what the exact issue was, I started to suspect heat. This was mainly because the computer didn't crash under basic use or idling, and both minidump files showed a crash while the system was performing maintenance that could involve the CPU. I tried running the CPU test while running the ASUS AI Suite II monitoring module, but the diagnostic test turned off the module before running. The computer made it through about 30 seconds before crashing, and this time the BIOS gave a CPU temperature warning when the computer restarted. This confirmed the heat issue, but I still wanted to see what was going on. This led me to discover the root problem.

I really wanted to watch the CPU temperature while I ran a stress test, so I downloaded Core Temp, a utility I have been using for a while. I installed the program and watched it while I ran the stress test. The system crashed after about 40 seconds, and Core Temp showed that it definitely was a temperature issue. The core temperatures shot past 105 °C (the TJ Max, or the point where the processor will start to throttle or shut down to avoid damage) very quickly. Core Temp also inadvertently showed me the root cause of the problem: it seems my processor was overclocked.

Core Temp Window (image from Core Temp Web site)
In addition to showing the core temperature, Core Temp also shows other processor information. One item is the frequency, which, in my case, showed that the processor was set at around 4.2 GHz. Since my processor's stock speed was 3.5 GHz, and it had a turbo speed of 3.9 GHz, this was weird. I went into the ASUS AI Suite II app and found a module that showed the processor settings. This confirmed that my processor was overclocked. I used the utility to change the settings back to stock, and then ran the stress test again. It made it through one minute without an issue, and the temps didn't get above 65 °C. I ran a 10-minute test with the same result. Now the question was "how did this happen?"
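The sanity check here is just a comparison: a sustained frequency above the chip's rated turbo speed means something has overridden the defaults. A toy illustration - the 3.5/3.9/4.2 GHz figures are from my chip, but the function itself is mine:

```python
def clock_status(observed_ghz, stock_ghz, turbo_ghz):
    """Classify an observed CPU frequency against the chip's rated speeds."""
    if observed_ghz > turbo_ghz:
        return "overclocked"  # sustained speed beyond the rated turbo limit
    if observed_ghz > stock_ghz:
        return "turbo"        # normal short-burst boost
    return "stock"

# My chip: 3.5 GHz stock, 3.9 GHz turbo, but Core Temp showed ~4.2 GHz
print(clock_status(4.2, 3.5, 3.9))  # -> overclocked
print(clock_status(3.7, 3.5, 3.9))  # -> turbo
```
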

While I am not exactly sure what happened, I have narrowed it down to the fact that I like to poke around with settings: at some point, I had clicked a button in the ASUS AI Suite II software that caused the system to automatically overclock. Since I had just been poking around, I hadn't noticed, and this is what caused the system to crash under high processor loads. I made sure the stock settings stuck, purchased an aftermarket cooling system, and haven't had random crash issues since. I have subsequently overclocked my system on purpose, and it is still stable as a rock.

The main lesson I learned in all of this is that, when I am poking around with settings, I should find out what they do before I try them (especially if they mess with core system settings).


Saturday, January 19, 2013

Lose Weight Now... Ask Me How

Disclaimer: I am not a doctor or health care professional. Before beginning any diet or exercise program, please check with your doctor or other health care provider as they can advise you much better than some random blogger.

With the new year having come and gone, I'm sure there are a lot of people who made and are working on keeping the resolution to lose weight, get fit, or stay in shape. This can be a hard one to keep (I've had a few experiences with this one...) but, with the help of technology, maybe not as hard as it used to be.

Ask any doctor for the best way to lose weight and keep it off, and they will say "eat less and exercise." It really can be that simple: if you burn more calories than you need to sustain your weight, you will get lighter (though I do realize that there are several other factors that can affect weight loss for many people, such as genetic predispositions, thyroid issues, etc. Again, it is best to consult with your health care professional before beginning any weight loss program). The problem is making sure that you actually burn more than you take in. When I have tried to lose weight by eating less in the past, I sometimes wouldn't see a change - mainly because, while I thought I was eating less (and/or exercising the surplus away), I actually wasn't.
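The "calories in vs. calories out" arithmetic is easy to sketch. The snippet below uses the common rule-of-thumb estimate of roughly 3,500 kcal per pound of body fat; it is an illustration with made-up intake and burn numbers, not medical advice:

```python
CALORIES_PER_POUND = 3500  # common rule-of-thumb estimate, not a precise constant

def weekly_loss_lbs(daily_intake, daily_burn):
    """Estimated pounds lost per week from a steady daily calorie deficit."""
    deficit = daily_burn - daily_intake
    return deficit * 7 / CALORIES_PER_POUND

# Burning 2,500 kcal/day while eating 2,000 is a 500 kcal daily deficit
print(weekly_loss_lbs(2000, 2500))  # -> 1.0 (about a pound per week)
```

This is exactly why accurate logging matters: overestimate your burn or underestimate your intake by a few hundred calories a day, and that deficit quietly disappears.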

Since this is something that I am interested in, I'm starting a new category, fitness, where I will write posts about ways technology can help you start, track, and maintain a healthier lifestyle. Today I'll cover online food diaries, a great way to monitor calorie intake for a weight loss program.

Online food diaries

One way to ensure that you lose weight is to keep a food diary. In the past few years, several free online resources have cropped up that help you do just that. These sites all work in a similar fashion. Users create an account and provide some basic information (current weight, target weight, how many pounds they want to lose each week) and the site sets goals (calories per day, exercise calories, etc.) based on this information. They provide the users with diaries to track daily food intake and exercise, and many also offer discussion forums and other social media functions like the ability to friend people and comment on achievements.

All the sites I have seen have pretty extensive databases of foods (both a "standard" database and user-submitted entries), and they allow you to add items that don't exist or are not correct. They also allow you to create custom recipes, so you'll finally know exactly how many calories your homemade lasagna has. Many of the sites also track other items such as fat, sodium, vitamins, iron, etc., all based on the standard USDA nutritional information requirements. This can be handy if, in addition to losing weight, your doctor advises watching intake of things like fat, sodium, or cholesterol. Several of the Web sites have iOS, Android, Windows Phone, and Windows 8 apps to allow tracking no matter where you are, and also allow scanning of UPC labels to quickly enter packaged foods.

But do they really work? The short answer is "yes." The longer answer is "Yes, but you need to be motivated and use the available resources." I have actually lost 38 pounds since September using one such site - and exercising about 4 times a week. Others have had good luck as well. The hardest part for me starting out was remembering to log my food and exercise. Other people I know who use MyFitnessPal say they feel that it gives them too few calories per day. While that can be tweaked by changing weekly weight loss goals, I can see how it can be annoying.

Friday, January 4, 2013

Building a Box - The Build

Meet my new PC
Well, this is progressing much faster than I thought. I am actually typing this on the box that the series is about. The combination of Black Friday/Cyber Monday sales, and Christmas presents made it possible to get most of the parts, and a great sale on the processor at Microcenter (more than $100 off list, and at least $60 less than the best sale I found online) sealed the deal.

Final specs

As my uncle pointed out in the comments of the last post, I didn't really talk about the power supply or the case. In order to support future expansions, I decided to go with a 600-watt power supply. As I add graphics cards and other components, this will ensure that I have ample power available. I also selected a mid-tower case. This has plenty of space for HDDs, SSDs, optical drives, and, most importantly, cooling fans.
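If you're sizing your own power supply, the budgeting is just addition: sum the estimated component draws and see what headroom is left for future upgrades. The wattage figures below are rough illustrative guesses, not measured values from my actual parts:

```python
PSU_WATTS = 600  # the 600-watt supply in this build

def headroom(component_watts):
    """Remaining PSU capacity after summing estimated component draws."""
    return PSU_WATTS - sum(component_watts.values())

# Rough ballpark estimates for illustration only
draws = {
    "CPU": 95,
    "motherboard": 50,
    "RAM": 10,
    "SSD + HDD": 20,
    "fans/optical": 25,
}
print(headroom(draws))  # watts left over for a discrete graphics card
```

Mid-range graphics cards of this era typically wanted 150-250 W of headroom, which is the main reason to oversize the supply up front.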

Parts List

If you read my last installment, you're aware of the general specs of the machine. Here is the list of specific parts I used in the system:
I am currently using the i7's integrated Intel HD 4000 graphics and the P8Z77's integrated Realtek sound processor. I plan on upgrading to a discrete video card in the near future. I am also thinking about getting a Blu-ray burner so I can burn my vacation videos in full HD.

The Build

The actual build was a relatively painless process - I went from a pile of parts and a couple of DVDs to a working Windows system in about an hour and a half. While I haven't built a whole PC at once before, throughout my career I have replaced each and every standard computer component except for a power supply. This prior experience certainly helped the build go more smoothly, especially when it came to figuring out what plugs in where on the motherboard. The biggest problem I had was that I left one of the motherboard power connections unplugged, so the system didn't boot with the first press of the power switch. A few minutes of poking around fixed that. The most surprising part of the build was how quickly Windows 8 Pro installed - it only took about 15 or 20 minutes to have it up and running.

Here are some pictures from the build:
Case with PSU
Motherboard Installed, awaiting processor.
All components plugged in and ready to go!
Waiting to start!

Initial Performance Notes

The computer is pretty fast. A cold boot (from completely off to working in an Office app) takes about 25 seconds - a time similar to my Sony Vaio Ultrabook. The only set of benchmarks I have run is the Windows Experience Index.

Windows Basic System Information 
The overall rating of 5.9 doesn't really tell the whole story, as it is based on the lowest score - desktop graphics - which is due to the fact that I am currently using the on-chip graphics. When you look at the breakdown below, you see that the processor, memory, and primary hard disk scores are 8.0 or better. Actually, the desktop graphics score is surprisingly good for on-chip graphics, considering that the moderately priced nVidia discrete card in my other desktop only scores 6.5 for desktop (and ties the Intel HD 4000 for gaming/3D performance).

More details from Windows
I plan on running some more comprehensive benchmarks, and will cover that process in a separate post.
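In case you're wondering how the base score works: Windows simply reports the lowest subscore, not an average. A one-liner shows the idea (the processor/memory/disk values reflect the 8.0-or-better floor mentioned above; the gaming-graphics value is illustrative):

```python
def windows_experience_index(subscores):
    """The WEI base score is simply the lowest component subscore."""
    return min(subscores.values())

scores = {
    "processor": 8.0,          # text says 8.0 or better
    "memory": 8.0,             # text says 8.0 or better
    "primary_disk": 8.0,       # text says 8.0 or better
    "desktop_graphics": 5.9,   # the on-chip graphics bottleneck
    "gaming_graphics": 6.5,    # illustrative value
}
print(windows_experience_index(scores))  # -> 5.9
```

That design means a single weak component - here, the integrated graphics - caps the whole rating, which is why a discrete card is the obvious next upgrade.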


Other than the missed power plug, I didn't have any problems with the build itself. Once the system was up and running, however, I did run into an issue. The system had an "unexpected error" of the Blue Screen of Death (BSOD) variety. Windows said it may be a hardware problem, but wasn't specific. What concerned me about the error is that it happened at 3:00 AM on Jan 2, when the system really wasn't doing anything. After doing some troubleshooting, I figured out and solved the underlying problem - processor heat and inadvertent overclocking. It's an interesting story that I will cover in another post, but the basic story is that I accidentally clicked a button in the ASUS AI software that overclocked my processor to 4.2 GHz, and the stock heat sink couldn't keep up under heavy load.

Getting it Ready

After figuring out the BSOD issue, I spent the last few days getting all my "standard" software installed and my files moved from the old machine. Since my old system is no slouch, I am cleaning it out and giving it to my wife, who needs a desktop she can use for development. I also opened the case a few times to tweak the setup - mainly re-arranging the components and using the cable management to hide the wires as much as possible.

What's Next

The next steps are going to be researching a decent cooling system - it still gets a little hot for my liking under load, even if it doesn't crash - and finding a good graphics card. If you have any ideas for either of those, or just general thoughts on the build, feel free to leave me a comment.

Tuesday, January 1, 2013