Wednesday, February 5, 2014

Linux Screencasting or Livestreaming

The goal of this post is to introduce a GUI screencasting application that I use both for recording my desktop gaming sessions and for livestreaming them to sites like Twitch.tv and Hitbox.tv. I will show you how to compile and install the latest version from Git, as it has some improvements in the code that are not available in the PPA version of the software. Compiling code shouldn't scare you away; it will not harm your system in any way if you follow along step by step. It is often necessary to compile software on Linux to get the latest version of a package, because the versions maintained in the official repositories are not always up to date.

There are various GUI packages out there that let Linux users capture their desktop, whether to create a screencast of how to complete a particular task in Linux or to make a video showing how to get past a certain level in a game: gtkRecordMyDesktop, Istanbul, and Kazam, for example. Whenever I tried any of those, though, I always had issues with synchronization of the video and audio; either it was out of sync from the start, or it drifted out of sync over time. None of those packages let me livestream the action in real time (or as close to real time as you can get), either. Then I came across a package called SimpleScreenRecorder.


SimpleScreenRecorder is a GUI screencasting application. It's similar to the other software listed above in that it uses FFmpeg at its core, but the key differences are that it has built-in synchronization code as well as the ability to livestream to sites like Twitch or Hitbox.tv. The author, Maarten Baert, has a PPA if you just want to try that version of the software, but the PPA version does not let you set the keyframe interval (Twitch and Hitbox expect a keyframe every 2 seconds), and it also lacks the improvements he made to the synchronizer and the scaling feature. Let's get right to it and compile SSR from Git.

The README.md file within the source tree contains all the information you'll need to build and install it for your Linux distribution, except that for Ubuntu 13.10 it's missing one library, so we'll walk through those steps together right now. These steps are for a 64-bit architecture, so if you're running a 32-bit version of Ubuntu 13.10 your steps may vary slightly, but I believe Maarten has accounted for both 32-bit and 64-bit in his build-and-install script, so don't give up now.

Let's get started. In Ubuntu 13.10 64-bit you can open a terminal by holding the Alt key and pressing F2, which brings up the Dash; simply type in terminal and then click on gnome-terminal to open that application.
First we need to install the dependencies. *NOTE* Ensure you scroll inside the code box to copy and paste everything required.
sudo apt-get install git build-essential cmake pkg-config qt4-qmake libqt4-dev libavformat-dev libavcodec-dev libavutil-dev libswscale-dev libasound2-dev libpulse-dev libjack-jackd2-dev libgl1-mesa-dev libglu1-mesa-dev libx11-dev libxext-dev libxfixes-dev g++-multilib libxext6:i386 libglu1-mesa:i386 libxfixes3:i386
Let's first make sure you're in your home directory. The cd command with no arguments will get you there; just type cd and hit enter in the terminal, then check by typing pwd and hitting enter. It should show /home/yourusernamehere.
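Put together, the check looks like this (the pwd output will show your actual username in place of the placeholder):
cd
pwd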
Now we need to get the code from his Git repository:
git clone https://github.com/MaartenBaert/ssr.git

Now let's switch to his glinject-next branch, which has the improvements to the code and the ability to set the keyframe interval as required by Twitch and Hitbox. First we'll need to cd into the newly created ssr folder (run the next 2 commands separately, 1 line per command):
cd ssr
git checkout glinject-next
Now we can run the simple-build-and-install script from the ssr folder.
./simple-build-and-install
If you receive the following error, then the 32-bit libraries are not linked correctly for a 64-bit system and we need to create some symlinks; which ones depends on what type of graphics drivers you're running on your system.
checking for XOpenDisplay in -lX11... no
configure: error: required library missing
If you're running open source drivers (intel, radeon, or nouveau) then you need to create the following symlinks. First cd into the appropriate directory and then create the 2 symlinks (run the next 3 commands separately, 1 line per command).
cd /usr/lib/i386-linux-gnu
sudo ln -s libGL.so.1 mesa/libGL.so
sudo ln -s mesa/libGL.so libGL.so
If you're running proprietary graphics drivers (fglrx or nvidia) then you need to create some additional symlinks, which you can find out about by reading the SSR source README.md file. It's located within the ssr folder that you cloned from Git.

For all drivers we need to create these additional 4 symlinks. You should still be within the /usr/lib/i386-linux-gnu/ directory when you create them (run the next 5 commands separately, 1 line per command). *NOTE* Ensure you scroll inside the code box to see everything required.
sudo ln -s libGLU.so.1 libGLU.so
sudo ln -s libX11.so.6 libX11.so
sudo ln -s libXext.so.6 libXext.so
sudo ln -s libXfixes.so.3 libXfixes.so
sudo ldconfig

We can now cd back into the ssr directory and run the simple-build-and-install script again, now that the 32-bit libraries are properly linked (run the next 2 commands separately, 1 line per command):
cd ~/ssr
./simple-build-and-install

That should run successfully, and the final lines of output should look like the following:
See any operating system documentation about shared libraries for
more information, such as the ld(1) and ld.so(8) manual pages.
----------------------------------------------------------------------
make[2]: Nothing to be done for `install-data-am'.
make[2]: Leaving directory `/home/vm1/ssr/build32/glinject'
make[1]: Leaving directory `/home/vm1/ssr/build32/glinject'
Making install in scripts
make[1]: Entering directory `/home/vm1/ssr/build32/scripts'
make[2]: Entering directory `/home/vm1/ssr/build32/scripts'
make[2]: Nothing to be done for `install-data-am'.
make[2]: Leaving directory `/home/vm1/ssr/build32/scripts'
make[1]: Leaving directory `/home/vm1/ssr/build32/scripts'
Making install in src
make[1]: Entering directory `/home/vm1/ssr/build32/src'
make[2]: Entering directory `/home/vm1/ssr/build32/src'
make[2]: Leaving directory `/home/vm1/ssr/build32/src'
make[1]: Leaving directory `/home/vm1/ssr/build32/src'
make[1]: Entering directory `/home/vm1/ssr/build32'
make[2]: Entering directory `/home/vm1/ssr/build32'
make[2]: Nothing to be done for `install-exec-am'.
make[2]: Nothing to be done for `install-data-am'.
make[2]: Leaving directory `/home/vm1/ssr/build32'
make[1]: Leaving directory `/home/vm1/ssr/build32'
Running post-install script ...
Done.
You've just successfully compiled and installed SimpleScreenRecorder from Git using the glinject-next branch. You won't be able to encode (capture) with H.264, AAC, or MP3 yet, though. For those to show up as choices in SSR's encoding settings you'll need to install the ubuntu-restricted-extras package by typing the following command into the terminal. It may prompt you to confirm the installation by typing "y" for Yes, and it may also ask you to accept the license for the TrueType fonts and the MP3 and AAC codecs; just hit Tab so that "ok" is highlighted and then hit Enter.
sudo apt-get install ubuntu-restricted-extras
*NOTE* If you're running Kubuntu, Lubuntu or Xubuntu, be sure you install its applicable restricted-extras package, e.g. xubuntu-restricted-extras.
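To confirm everything is in place you can launch SSR straight from the terminal (it should also show up in your applications menu); the binary installed by the build script should simply be called simplescreenrecorder:
simplescreenrecorder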
This post was only meant to show you how to install SimpleScreenRecorder. I may do a follow-up on how to use it properly, but Maarten already has a lot of great information on his website.

-Ubu out

Friday, November 29, 2013

Xbox 360 Controller and Steam Games (Valve's Source Engine Games)

Some may yell at me for wanting to play PC games with a controller. They all say, "PC games are meant to be played with a keyboard and mouse." Well, I grew up playing console games, which always had a controller. It's not for lack of trying, but when I play with keyboard and mouse my movement is less than adequate and my fingers don't have the buttons memorized yet, so I end up dying because I didn't move away from an enemy quickly enough. W, A, S, and D are the keys normally used in PC FPS games for moving forward, backward, and strafing side to side, while the mouse controls where you're looking and where your gun shoots. So until I get better with a keyboard and mouse I'd like to play my PC games with a controller at times.

I was happy to see that Serious Sam 3: BFE (a first-person shooter) had controller support. Most Steam store pages will denote whether a game has controller support or not; sometimes it says "Partial Controller Support", and I'm not certain what that means, but some controller support is better than none in my opinion. I took a chance and bought the game, and when I fired it up my Afterglow Xbox 360 controller worked immediately. This is with a default Ubuntu 12.04.3 installation using the xpad kernel module (driver); I didn't have to install xboxdrv, which is a userspace driver. There are some advantages to using xboxdrv, but I won't cover those in this post.

I quickly found other games the same controller did NOT work in, notably all of Valve's Source engine games; in the ones I tried (Team Fortress 2, Portal, and Left 4 Dead 2) the controller did nothing. I tried everything, from opening the in-game console and entering "exec 360controller" and "exec 360controllerlinux" to allowing world-readable permissions on the /dev/input/event11 device node the controller was attached to, but nothing worked. After many hours of googling and trial and error I found what finally solved the controller issue for Valve's Source engine games. The original solution has to be credited back to a Google post, just to give credit where credit is due.

First open the Steam client and you'll notice there is a place to click to activate Big Picture Mode; it's located on the upper right side of the Steam client.
Click that and it will activate Big Picture Mode, which is basically Steam running in fullscreen. Use your mouse to click on the little gear symbol in the upper right corner, which is where all the settings are, then click on Controller. Next you'll see the following screen, and hopefully it states that your controller is detected. If not, sorry, I can't help any further; you'll have to investigate why Steam can't detect your controller.
Click on edit controls and you'll be taken to this screen. 
You just go down the list: Steam highlights a button on the screen in green and you press the corresponding button on your wired Xbox 360 controller. I am not sure whether this works with a wireless controller, due to Microsoft's proprietary wireless technology; you would definitely need a wireless receiver dongle from somewhere, if they even make them. Once you have mapped each button, click save; you can choose to give the layout a name or overwrite the existing name that was there. Now you are all done.

You can now use your controller within Team Fortress 2, Portal, and Left 4 Dead 2. One thing to note is that I did have to increase the sensitivity within TF2 because the movements were really slow, but other than that I am very pleased I got the controller working.

-Ubu out

Sunday, November 24, 2013

Linux Video Capture using an HD-PVR (model# 1212)

Hello there, fellow tech nerds. How has everyone been doing lately? I have been thrilled with life lately, and I hope you have been as well.

Today's post is going to cover how to capture HD video (720p) from a component video source onto your Linux computer so that you can edit it with video editing software. I do this to capture my Xbox 360 gameplay. I use a microphone and Audacity to record commentary separately, then edit the video files captured from the HD-PVR and the audio commentary together in Kdenlive, render it, and upload the videos to YouTube. I do this to help others with certain parts of a game or just to provide general tips about a particular game. So let's get right into it.

First you'll obviously need a Hauppauge HD-PVR, but it has to be model number 1212 (the wiki states that model# 1445 is also supported, but I don't see that as a model# on the Hauppauge support page). The newer ones (HD-PVR 2) don't have Linux drivers, so there is currently no way to capture from them. Luckily, if you're using a recent Linux distribution the driver that allows this to work is built right into the Linux kernel: any kernel 2.6.30 or above includes it. I am running Xubuntu 12.04.3 with kernel 3.7.0-030700-generic. You may possibly need a Windows computer to upgrade the firmware on the HD-PVR, which is done by running a Windows executable from Hauppauge's support page; it installs the Windows driver but also updates the firmware within the HD-PVR. The latest firmware is preferred, and it's all explained HERE.
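If you're not sure which kernel you're running, a quick terminal one-liner will tell you; anything at or above 2.6.30 ships the hdpvr driver:
uname -r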

Step 1 (determine current firmware on your HD-PVR)
Open a terminal window and plug your HD-PVR into a USB 2.0 port if possible (I don't think USB 1.1 is fast enough, and I had issues with a USB 3.0 port where it would stop recording by itself, so avoid that if you can or try it and see for yourself), then type in
dmesg
Mine returns the following information
[6307.550581] usb 1-3: new high-speed USB device number 13 using ehci_hcd
[ 6307.727776] usb 1-3: New USB device found, idVendor=2040, idProduct=4903
[ 6307.727779] usb 1-3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[ 6307.727781] usb 1-3: Product: Hauppauge HD PVR
[ 6307.727783] usb 1-3: Manufacturer: AMBA
[ 6307.727784] usb 1-3: SerialNumber: 00A49D0D
[ 6307.755504] hdpvr 1-3:1.0: firmware version 0x1e dated Mar  7 2012 08:25:15
[ 6307.755507] hdpvr 1-3:1.0: untested firmware, the driver might not work.
[ 6307.869496] hdpvr 1-3:1.0: magic request returned 8
[ 6307.878237] hdpvr 1-3:1.0: config call request for value 0x1700 returned 1
[ 6307.887215] hdpvr 1-3:1.0: config call request for value 0x1500 returned 1
[ 6307.922052] hdpvr 1-3:1.0: config call request for value 0x1200 returned 1
[ 6307.931548] hdpvr 1-3:1.0: config call request for value 0x1300 returned 1
[ 6307.940780] hdpvr 1-3:1.0: config call request for value 0x2900 returned 1
[ 6307.950518] hdpvr 1-3:1.0: config call request for value 0x2a00 returned 1
[ 6307.958510] hdpvr 1-3:1.0: config call request for value 0x2b00 returned 1
[ 6307.967748] hdpvr 1-3:1.0: config call request for value 0x2c00 returned 1
[ 6307.977615] hdpvr 1-3:1.0: config call request for value 0x2d00 returned 1
[ 6307.996731] hdpvr 1-3:1.0: control request returned 4
[ 6307.997589] hdpvr 1-3:1.0: no valid video signal or device init failed
[ 6308.007835] hdpvr 1-3:1.0: control request returned 1
[ 6308.025559] hdpvr 1-3:1.0: control request returned 1
[ 6308.025561] hdpvr 1-3:1.0: allocating 64 buffers
[ 6308.052476] hdpvr 1-3:1.0: device now attached to video1
[ 6308.052495] usbcore: registered new interface driver hdpvr
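That's a lot of output; if you only want the hdpvr lines (including the one showing the firmware date), you can filter it like this:
dmesg | grep -i hdpvr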
The hdpvr module should have autoloaded once you plugged in your HD-PVR; if it didn't, we can load it manually by issuing
sudo modprobe hdpvr
Nothing will appear to happen if all went well but to verify it's loaded you can issue
lsmod | grep hdpvr
That should return the following info
hdpvr                  32700  0
v4l2_common            21458  2 s2253,hdpvr
videodev              130085  5 s2253,hdpvr,uvcvideo,videobuf2_core,v4l2_common
If it didn't, you'll have to figure out what's wrong with the hdpvr module in the kernel you're using; sorry, this tutorial won't cover that.
So according to my dmesg output and the HD-PVR MythTV wiki, I am already running the latest firmware, which is a good thing. If you're not running firmware dated Mar 7 2012 then I suggest updating the HD-PVR firmware, because it can fix various issues, including color and saturation problems present in previous firmwares. I am not sure the updater would work within a virtual machine, so if you don't have a Windows computer, see if any friends or family would let you use theirs. You would just need to bring the HD-PVR with you, along with its USB cable; if they have internet you can download the latest driver linked to earlier, hook up the HD-PVR, run the .exe driver installer, and it will update the HD-PVR with the latest firmware. If you can't update the firmware for whatever reason, that's ok; you can still move forward with the tutorial, just note that if issues arise they may be related to the old firmware on the HD-PVR. NOTE: Certain kernel (lower than 3.3) and firmware combinations may also result in color and saturation issues; those can be read about HERE.

Step 2 (setting some sane defaults for the HD-PVR)
The way we set the module defaults (a module is the equivalent of a driver in Windows), at least within Ubuntu, is by creating a *.conf file and storing it within /etc/modprobe.d/. The *.conf file will be read whenever the hdpvr module is loaded. The available settings are all listed HERE. Create the *.conf file:
gksudo leafpad /etc/modprobe.d/hdpvr.conf
You can name it whatever you want, but I named it hdpvr.conf so I know what the conf file is for. I used Leafpad, but you can use whatever GUI text editor is installed on your system; I run Xubuntu so the default editor is Leafpad, while yours may be gedit if you run stock Ubuntu.
Within the file, paste the following:
options hdpvr hdpvr_debug=1 video_nr=1 default_video_input=0 default_audio_input=0
I enabled the least logging possible while still showing some debug info (useful for troubleshooting). Since I always have a webcam connected, I made the HD-PVR video device node 1 (it will show up as /dev/video1). I want to capture the component input (versus composite: yellow, white, red), and finally I want to capture the audio from the rear RCA ports, NOT the front ones. Adjust those settings to match your system and situation. For the settings to take effect without rebooting your machine, you can issue the following to reload the hdpvr module, which will then use your settings.
sudo rmmod hdpvr
That will unload the module
sudo modprobe hdpvr
That will load it again using the settings from the *.conf file you created.
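A quick way to confirm the settings took effect is to list the video device nodes; with video_nr=1 the HD-PVR should show up as /dev/video1 (my webcam keeps /dev/video0):
ls -l /dev/video*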

Step 3 (capture some video)
Unfortunately, for now there is no nice little GUI (graphical user interface) application that captures the HD-PVR video stream, although I am in talks with the developer of SimpleScreenRecorder to see if he can incorporate HD-PVR capture into his application. So for now it's a pretty basic command that we have to run in the terminal. For this to work, the user you log in as needs to be part of the video group; you can google how to achieve this, as it's not hard, and I also show a one-liner for it just after the capture command below. So open a terminal window, then change directories to wherever you want the recording to be saved. Ensure there is ample free space within this folder, since capturing HD video takes up a lot of space: around 500 megabytes (that's 0.5 gigabytes) for roughly a 10 minute video. That's done with
cd /home/username/Videos/
You obviously need to change username to YOUR username and ensure the full path is writable by you. I have a Videos directory located in my home directory which I have write permission to; normally any folder within your home directory is writable by you. The command to capture the video is
cat /dev/video1 > video.ts
The Linux cat command concatenates files and prints them to standard output. Basically we're using cat to spit out whatever comes from /dev/video1, which in this case is the video stream from the HD-PVR. The little right-pointing arrow ">" is the shell's redirection operator; it redirects output somewhere else, so in this case it takes the video stream coming out of the cat command and redirects it into a file called video.ts. That's pretty much it: whenever you want to stop recording you simply hit the Control key plus the letter c (Ctrl+C), which cancels the command.
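As promised, here are a couple of optional extras, just sketches using standard commands: adding your user to the video group (log out and back in afterwards for it to take effect), checking the free space in your Videos folder, and a variation of the capture command that stamps the date and time into the filename so a new recording never overwrites an old one. Swap in your own username and paths as needed.
sudo usermod -aG video yourusernamehere
df -h ~/Videos
cat /dev/video1 > "capture-$(date +%Y%m%d-%H%M%S).ts"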

Congrats, you now have a video file called video.ts which contains whatever was being output by your video device. As far as video and audio specifications go, the HD-PVR can record up to 1080i. Personally I prefer 720p over 1080i, since the p means progressive, which can mean less flickering. It boils down to what resolution you're feeding into the HD-PVR: it has a passthrough, meaning whatever signal you feed into the HD-PVR gets passed out to a TV to view, and at the same time the video is encoded within the device using the H.264 codec for video and the AAC codec for audio. The encoding information for the HD-PVR can be read about HERE; it's all in the 2nd and 3rd paragraphs if you're interested in learning about the bitrate, resolutions, codecs, etc. In my example I play my Xbox 360 at 720p, so the video.ts file contains HD quality (720p) video. Now go and create some masterpiece videos and showcase them on YouTube for the world to view.
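If you're curious what actually ended up inside video.ts, the ffmpeg/libav tools can print the stream details. Depending on your Ubuntu release the tool is called ffprobe or avprobe (you may need to install the ffmpeg or libav-tools package first), so one of these should work:
ffprobe video.ts
avprobe video.ts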

Here's a sample of my work all done using Xubuntu Linux

-Ubu out

Sunday, November 17, 2013

Transfer Ubuntu OS installation to a larger Hard Drive (2TB or smaller)

Skip to "Let the tutorial begin" if you don't want to read any back story.

First I'd like to say sorry to those who follow this blog; I have not posted in a very long time. I've been hip deep in my YouTube channel and gaming (both on Linux and on Xbox 360), so I haven't had much to blog about as far as technology goes. I am hoping to get a post up about SteamOS and other technology-related stuff very soon, but let's start with this post for now.

I was debating what exactly to title this post, since theoretically the process could be used to transfer other Linux distributions as well, and heck, it may even work to transfer a Windows or OS X installation to another hard drive, but I figured I would only cover exactly what I did since I know it works. This post will go over the steps I took to successfully move from a small 40GB IDE hard drive to a larger 160GB SATA hard drive AND, at the same time, change ALL my computer hardware as well.

A little backstory first. I built my 4th computer in 2007 and, believe it or not, I have been using it ever since. With my recent YouTube adventures and getting into more PC gaming, the computer started to really show its age: I couldn't play certain games, and rendering videos for YouTube took forever and a day. So I finally decided I needed to upgrade everything. I was planning on saving my money for Black Friday or Cyber Monday, but a deal for a used computer sprang up out of nowhere, and for a really good price I might add. For a mere $200 I could get a completely new tower. The main specs of the tower are as follows:

Sentey Modtower case with 7 fans
AsRock Extreme6 FM1
AMD A8-3870k 3.00Ghz
G-Skill Sniper 2133Mhz DDR3 RAM (2x4GB totalling 8GB)
XFX HD5750 1GB DDR5
OCZ Synapse Cache SSD 64GB
400w PSU that powers the motherboard and GPU
380w PSU that powers everything else

My Xubuntu 12.04.3 install was currently on a 40GB IDE hard drive, and since the new motherboard didn't have any IDE ports I needed to figure out how to transfer my operating system installation to my new (used) hardware. I found a 160GB SATA hard drive in another computer of mine, moved all the data off it onto my Western Digital My Book World Edition (basically a NAS), and I was ready to move my Xubuntu installation from the small 40GB drive to the larger 160GB drive.

Let the tutorial begin
Things needed
-A computer to create the live usb sticks
-You'll need a minimum of (2) 1GB flash drives (to run live USB versions of Clonezilla and linux-secure-remix from) OR optical discs such as CDs or DVDs, assuming your computer has an optical drive. You could use the same 1GB stick for both Clonezilla and linux-secure-remix, BUT it will take more time since you have to use linux-secure-remix in between uses of Clonezilla.
-A storage location large enough to backup the images of your partitions (network storage OR local external hard drive)
-Clonezilla (I chose amd64 because the computer I was running the live USB stick from had a 64-bit chip. If you aren't sure, it's ok to choose the i686pae version)
-Linux-Secure-Remix (choose linux-secure-13.04-64bit if the OS you're moving is a 64bit OS or choose linux-secure-13.04-32bit if the OS you're moving is a 32bit OS)

Alright, so I trust you downloaded the software you'll be using from above and we're ready to get going.

Step 1 (install new hard drive)
Install your new hard drive (the drive you're transferring your currently installed operating system TO) into the computer that will be the final hardware you're going to settle on. In my case it was a completely different computer tower, BUT you may only be transferring your OS to a new hard drive without changing any hardware, in which case you would install the new drive into your current computer.

Step 2 (live clonezilla media creation)
Using your current operating system, create the live USB or live CD of Clonezilla. Clonezilla has some tips for creating it HERE. You cannot simply copy the Clonezilla ISO to a USB stick or a CD as data; that won't work. You need to use software that writes the ISO to the USB stick or optical media and makes it bootable, meaning the computer's BIOS sees it as a bootable device and boots the USB stick or optical media instead of the internal hard drive. Once you're done creating your live Clonezilla media, shut down your computer.

Step 3 (backup partitions to image files)
If you made a live CD or DVD, put it in the optical drive and then turn off the computer containing the hard drive you want to transfer FROM. If you created a live USB stick, plug it in once the computer is off. Turn the computer on and activate the boot menu if you can (it was F11 on mine); this lets you choose which device to boot. In the case of a live CD, choose the correct optical drive; in the case of a live USB stick, choose the applicable USB stick.

It should boot into Clonezilla. Hit enter to choose the defaults, choose the proper language and hit enter; I picked "don't touch keymap" for the keyboard setting and hit enter. Choose Start Clonezilla, hit enter. Choose the device-image option (the first option), hit enter. The next screen is where you choose where you want to back up your images (what Clonezilla refers to as /home/partimag). NOTE: do NOT choose the new hard drive to store your images on, because it's going to be formatted later on. In my case I was using a 200GB external USB hard drive, so that option is local_dev; hit enter. Then choose the folder or directory where the image will be saved. Pick beginner mode, as that's the easiest and worked just fine for me. On the next screen I chose the saveparts option because I had a / partition and a /home partition I needed to back up. Then choose the partition you want to create an image of by arrowing onto it and hitting the space bar (it puts an asterisk next to the partition you're backing up), name it appropriately, and hit enter. I can't find any screenshots so I am not certain on the next few steps, but it should just be hitting enter and then hitting "y" to tell it to perform the backup image creation.

After it's done, you have the option of powering off, rebooting, starting over from scratch, or starting over keeping the same image location. I chose to start over fresh just in case. Perform the same steps to back up your next partition, but obviously this time choose the next partition you want to back up, and when it comes time to name it, make sure you choose a different name, because the default name will be the same as your first image backup. Hit "y" a few times and it should create another image, this time of the newly selected partition. I only had 2 partitions so I was done with this step, but if you have more partitions then keep creating images of them until you're done. Reboot the computer so it boots into your current operating system. Pfffff, that was a long step. Sorry about that.

Step 4 (live linux-secure-remix creation)
If you don't have a second flash drive, just use the same one you used for Clonezilla. Use the same steps to write the linux-secure-remix ISO to the USB stick as you used during step 2. Remember, use a method that "burns the ISO image" onto the USB stick and makes it bootable as well.
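For what it's worth, since linux-secure-remix is Ubuntu-based its ISO is a hybrid image, so on a Linux machine one way to "burn" it is to write it straight to the stick with dd. Treat this as a rough sketch: the filename should match whatever you downloaded, /dev/sdX is a placeholder for your USB stick's device (check with lsblk first), and dd will wipe whatever device you point it at, so double check before hitting enter.
lsblk
sudo dd if=linux-secure-13.04-64bit.iso of=/dev/sdX bs=4M
sync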

Step 5 (partition your new drive)
For this step you can either boot into your current OS installation OR boot the linux-secure-remix live media on the computer that has the new hard drive. In my case I had to boot the live media, since the new computer didn't have a hard drive with any OS on it. If you're using your current OS installation, then use whatever method you like to partition the new hard drive. This step is important in that you need to partition your new drive the same way your old drive is partitioned: not in size, but in partition numbering. NOTE: the new partitions should be larger than the ones on the drive you're transferring FROM. Primary and logical partitions need to match. I chose to stick with msdos (MBR) style partitioning rather than going to the newer GPT partitioning scheme, which is required for drives larger than 2TB; MBR works for drives of 2TB and smaller. Sorry, this tutorial won't go into switching from MBR to GPT, but I have read it can be done without data loss. In my case I had sda1 as my only primary partition, and sda2 was an extended partition which contained sda5 and sda6 as logical partitions. I formatted sda1 and sda6 as ext4. If you're using a live CD or live USB, boot the computer and enter the BIOS boot menu so you can choose the live CD or live USB stick to boot from. Once booted into linux-secure-remix (it's basically Ubuntu 13.04 with some pre-installed applications) you'll want to open the application called Gparted.
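Before you start clicking around in Gparted, it can help to double check the layout of the drive you're transferring FROM so you can mirror it on the new one. Either of these, run from a terminal on your current install (or from the live session with the old drive attached), will print the partition table; /dev/sda here is just my drive, so substitute yours:
sudo parted -l
sudo fdisk -l /dev/sda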


NOTE: If you have more than 1 disk in this computer, ensure you're working on the correct one. Follow the steps above for partitioning and ensure the partition layout matches the old drive you're transferring FROM. Now you're done partitioning the new drive and it's ready to accept the images of the partitions you're transferring. If you made a live CD, put the Clonezilla live CD into your optical drive before shutting down. You can now shut down your computer, as it's time to reboot into Clonezilla.

Step 6 (restore backed-up images onto new hard drive)
If you created a live USB stick, plug it in. Turn the computer on and activate the boot menu if you can (it was F11 on mine); this lets you choose which device to boot. In the case of a live CD, choose the correct optical drive; in the case of a live USB stick, choose the applicable USB stick. If you can't activate a BIOS boot menu, then go into the BIOS and change the boot order to the proper media, whether it's the live CD or the live USB stick.

It should boot into Clonezilla. Hit enter to choose the defaults, choose the proper language and hit enter; I picked "don't touch keymap" for the keyboard setting and hit enter. Choose Start Clonezilla, hit enter. Choose the device-image option (the first option), hit enter. The next screen is where you choose where you stored your images (what Clonezilla refers to as /home/partimag); in my case it was the 200GB external USB hard drive, so that option is local_dev, hit enter. Then choose the folder or directory where the images were saved. Pick beginner mode, as that's the easiest and worked just fine for me. On the next screen I chose the restoreparts option because now we're "restoring a partition" to the new hard drive. Choose the image you want to restore (this will be the first image you created, of your first partition) and hit enter. Now choose the destination, which should be the first partition on your new hard drive, most likely sda1 if the new hard drive is the only hard drive in the computer. NOTE: ensure you choose the correct hard drive and partition, as this will overwrite whatever is there with your backed-up image. Hit enter. There may be a few more enters, or it may ask a couple of times if you're 100% positive you want to perform this action, since it's going to overwrite whatever is on the "target partition" with what's in the image file. Hit "y" if you're sure and off you're running.

Once done, you have just restored your first partition from your old smaller drive to your new larger drive. Perform the same steps again for any more partitions that you imaged, and then you're done restoring images onto the new hard drive. We're almost DONE. You can attempt to reboot your computer after removing the Clonezilla live media, but I am betting it doesn't boot into your OS; I had a flashing cursor in the upper right corner and that was it. No Xubuntu for me.......YET.

Step 7 (fixing your MBR/boot loader, most likely grub2)
I realized after the fact that if we had chosen the advanced options when backing up the partitions to image files, and the advanced options when restoring, we might not have needed this step, but I chose beginner mode, so it is what it is: we need to install grub2 to the MBR so that the BIOS hands off to a bootloader which will then boot the installed OS. This is done using linux-secure-remix. So put your linux-secure-remix live CD or DVD in the optical drive OR plug in your live USB stick, and boot the computer to the one you created. Once inside linux-secure-remix we need to run the boot-repair application.
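linux-secure-remix ships with boot-repair already included, so normally you just launch it from the menu (or type boot-repair in a terminal). If it happens to be missing, or you ended up on a plain Ubuntu live session instead, it can usually be pulled in from its PPA; treat this as a fallback sketch rather than a required step:
sudo add-apt-repository ppa:yannubuntu/boot-repair
sudo apt-get update
sudo apt-get install boot-repair
boot-repair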


Choose the recommended repair, which is what fixed mine. When it shows you a URL for where it uploaded the boot-repair log file, write it down so that, in case this doesn't work, you can ask others for help and point them to the boot-repair log, which will help them figure out why it's not working. You should now reboot your computer, removing whatever live media you were using, and it should boot into your operating system.

That's it, YOU DID IT! Congrats. I hope this was helpful to someone. Leave me a comment if I missed something somewhere or to say thank you if this helped you.

-Ubu out






Thursday, May 9, 2013

ATI Remote Wonder and Linux XBMC

Controlling XBMC from your couch can be achieved in many ways. The easiest solution would be to just get a wireless keyboard/mouse combo, but we Linux users don't often go for what's easiest; we enjoy tinkering around and learning new things, otherwise we'd just be using Windows in the first place. We like to innovate, at least I do. The next easiest thing would be to install an XBMC remote control app on your smartphone; the official XBMC Remote app from joethefox is FREE in the App Store. There are other paid XBMC remote apps, but the free one worked just fine when tested on my iPhone 4S running iOS 6.1. However, this post is about getting an old ATI Remote Wonder working with XBMC on Linux. I believe there are a couple of different variations of the remote, but the one I have is pictured below; if yours looks different then your mileage may vary.


First let me inform you what versions of everything I am performing this on.
Ubuntu = Ubuntu 12.04.2 LTS (running mythbuntu-desktop because I dislike Unity)
Kernel = 3.2.0-40-generic i686
XBMC = 2:12.2~git20130502.1706-frodo-0precise (obtained from team-xbmc PPA)



This tutorial will mostly be done in a terminal session; if you're scared of the terminal, don't be. I will try to explain what each command does and why sudo (root privileges) is required. When you first plug in your ATI Remote Wonder USB receiver, the kernel will most likely automagically load the "ati_remote" module, which basically makes the remote act like a mouse; if you use the large circular pad towards the top of the remote you'll see it moving your mouse pointer and such. We don't want that, so let's remove that module by issuing the following command. Sudo is required because we're removing a kernel module, but don't worry, that module was only loaded because the kernel sensed the USB receiver when it was plugged in. Normally it's a good thing that the kernel automagically loads modules when hardware is plugged in, but in this case we don't want that functionality. Whenever you use sudo it asks for your user's password; enter it and press the 'enter' key. It doesn't show any characters as you type, but you are in fact typing in your password.
sudo rmmod ati_remote
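To double check that the module is actually gone you can search the loaded module list; if this prints nothing, ati_remote is no longer loaded:
lsmod | grep ati_remote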
If it says it can't remove the module because it's in use, just ignore that and continue on with the tutorial. Now we need to make sure that when we reboot the machine the module doesn't automagically load again; this is done by editing a configuration file. Since this config file is located within the /etc/ directory and is owned by root, we'll again need root privileges, so we'll be using sudo, but since we're opening a GUI (graphical user interface) application we actually want to use 'gksudo'. This is the command (NOTE: gedit is the default GUI text editor for Ubuntu; if you're using Kubuntu it is Kate, I believe. I am using Mythbuntu so the GUI text editor is actually called Mousepad for me. Replace gedit with whatever GUI text editor your distribution uses):
gksudo gedit /etc/modprobe.d/blacklist.conf
Once the file opens in your GUI text editor, add the following two lines to the very bottom of the file:
#to get ATI Remote Wonder working
blacklist ati_remote
The pound symbol marks a comment, which isn't read as a configuration line. Save the file and close the text editor. Restart your computer ONLY if you previously couldn't remove the ati_remote module; when you restart, the module won't load this time because you blacklisted it. Now we need to install LIRC, which stands for 'Linux Infrared Remote Control'. We'll again do that from the terminal session, using sudo because installing software requires root privileges. Type in the following command:
sudo apt-get install lirc
Choose 'Y' if it asks whether you're sure you want to install the software. If lirc was already installed and you'd like to reconfigure it, use the following command:
sudo dpkg-reconfigure lirc
It will bring up a debconf window. Debconf stands for Debian configuration; most applications within Ubuntu come as .deb packages, the common package format used by Debian-based Linux distributions. Within the debconf window, use the up and down arrow keys to highlight the ATI/NVidia/X10 RF Remote (userspace) option, then press Tab so that the '<Ok>' option is highlighted and press the 'enter' key on your keyboard. The next window that appears is for a transmitter, but we aren't transmitting anything, so highlight 'None', press Tab so that '<Ok>' is highlighted, then press 'enter'. If everything went ok, lirc should now be installed and the proper config files should be in place. Now we just have to let XBMC know that we'll be using a remote instead of a keyboard to control it, and that's done using a config file, Lircmap.xml, which needs to be stored within your user's .xbmc folder. For this we don't need root privileges, because we're writing the file within our own home directory, which we have write access to. The tilde (~) is shorthand for your user's home directory, /home/yourusernamehere/, so the whole path is actually /home/yourusernamehere/.xbmc/userdata/. The command is as follows:
gedit ~/.xbmc/userdata/Lircmap.xml
I uploaded my Lircmap.xml to Pastebin; you can download the file from this link: Lircmap.xml
Note that the 'L' is capital; that's important for the filename. So that button presses don't register twice, I had to edit the advancedsettings.xml file located in my user's ~/.xbmc/userdata/ directory and add the following lines. You may or may not already have an advancedsettings.xml file; if you don't have one, just create it, and if you already have one, just add the single remotedelay line, as you probably already have the opening and closing <advancedsettings> lines.
<advancedsettings>
<remotedelay>10</remotedelay>
</advancedsettings>

That should be it: fire up XBMC and your ATI Remote Wonder should now be working. If you want to know which button on the remote does what within XBMC, just look at the Lircmap.xml file in a text editor. For example, the "back" command in XBMC is performed by the remote button 'c', as per the following line in the Lircmap.xml file: <back>c</back>.
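If you'd rather build your own Lircmap.xml than grab mine, the file is just a mapping of XBMC button names to the button names lirc reports (you can watch those by running the irw command and pressing buttons on the remote). Here's a rough sketch of the structure only; the device name and most of the mappings below are placeholders you'd need to replace with what irw shows for your receiver, with just the <back>c</back> line taken from my actual file:
<lircmap>
  <remote device="your_lirc_device_name_here">
    <back>c</back>
    <select>ok</select>
    <up>up</up>
    <down>down</down>
  </remote>
</lircmap>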

-Ubu out