Archive for the ‘Computer’ Category

What’s going on with Ubuntu Certified Professional?

Monday, March 25th, 2013

Last week, I took the official Apple "Mac OS X 10.8 Essential Support Course", followed by the official test. I passed, so I am now an Apple Certified Technical Coordinator (ACTC), on top of all the other acronyms I hold.

Although I don't work on Mac OS X every day, I have a good working knowledge of its general handling and of the underlying OS. The course, which ran at quite a fast pace, summed up all the important points very nicely. The test featured the occasional tricky question, and a score of 73% or higher was required to pass. And pass I did.

We used the official Mac OS X 10.8 course book, which contains precise information on Mountain Lion (although, to be fair, the author probably only had to replace 'Lion' with 'Mountain Lion' to release this 'new' version). I now actually understand what happens after the kernel is loaded, which processes produce the login screen, what happens when a user logs in, and so on.

On the other hand, the last official book on the Ubuntu Certified Professional (UCP) was released in 2008 and was already out of date half a year later, thanks to the energetic activism the good people at Canonical display all year round. No wonder that, with all the changes that happened to Linux and all the changes Ubuntu brought upon itself, I still don't feel secure about the internal workings of Ubuntu. Sure, there's the source code, but I don't think anyone actually reads that to get a general understanding of an OS. The man pages? Please! You mean those cryptic writings whose overview sections are never really helpful because you need a PhD in reading man pages to understand them? Ah yes, the lack of useful examples is another gripe I have with man pages.

After passing LPIC 1, I was all fired up to become a UCP as well. But the lack of concise information put me off, and the ever-growing gap between the OS and the documentation put me off even more. To this day, no update to the Ubuntu Certified Professional book (available on amazon.com) has been released. I guess even the author got fed up and felt he could use his time in better ways. I sincerely doubt anything useful will be released on that particular topic in the future. And with Canonical pushing Ubuntu a bit further into its own niche with every release, Ubuntu will have a hard time becoming a viable candidate to compete against Windows in the enterprise (if that was ever their goal). Accordingly, the value of being a UCP shrinks and shrinks. Actually, I have never met anyone who was certified.

Maybe I should focus on LPIC 2 again, too…

wget, busybox, ipv6…

Thursday, February 28th, 2013

In our work environment, we use PXE and preseed to roll out Ubuntu 12.04 installations. In the late_command, we even splat standalone Puppet and a Subversion repository onto it. Works nicely.

Last week, however, this tested routine stopped working at a very early stage: when wget in busybox tries to figure out which release of Ubuntu to pull from the mirror server.

The actual code line was: wget -q ftp://somehost.domain.local/file -O - | grep -E '^(Suite|Codename):'
This line didn't return anything, so the installation got stuck. On my PC, the same line returned something useful (namely the lines starting with 'Suite:' and 'Codename:').

In the end, the problem was that IPv4 and IPv6 were both configured automatically, and wget in busybox prefers IPv6 over IPv4 but does not fall back to IPv4 when the IPv6 attempt fails. And since it is the busybox applet, you can't force a protocol family like you can with the full wget (see 'man wget', options -4 and -6).

The solution was to supply an additional kernel parameter in the PXE configuration: ipv6.disable=1

There are some users who report that this parameter does not work for them. Maybe they got the spelling slightly wrong or put it in the wrong place, but with this additional parameter our setup is working again. For reference, here is roughly where it goes.
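This is what a pxelinux entry with the parameter could look like (the label and the kernel/initrd paths are just placeholders, adjust them to your own setup):

label precise-amd64
  kernel ubuntu-installer/amd64/linux
  append initrd=ubuntu-installer/amd64/initrd.gz auto=true priority=critical ipv6.disable=1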

The sad thing about this is that the problem was already reported as a bug in 2007, but the status was set to 'wontfix' and the report was closed. Way to go 🙁

And I only wasted about six working hours on that crap.

 

Pictures to movie: A simple example of avconv

Sunday, February 17th, 2013

In 2011, I travelled from 豊岡 (Toyooka) to 京都 (Kyoto) by train. Every couple of seconds I took a picture with my camera. Since then, I wanted to merge all these pictures into a movie, if possible with some music.

I tried the better-known video editors such as OpenShot Video Editor, Pitivi Video Editor and Blender, but never really got far. Either the pictures couldn't be mass-imported, or they couldn't be distributed evenly, etc. etc.

Finally, a friend mentioned ffmpeg last week, so I tried it again. Unfortunately, ffmpeg seems to be deprecated on Ubuntu; the recommendation is to use avconv instead, which comes with the package "libav-tools".

Format conversion is an art of its own, but if you just want the basics, all you need is the following command:

avconv -f image2 -r 3 -i ./%04d.JPG -i soundfile.wav -c copy -crf 20 output.avi

Run this from the folder where your *.JPG pictures are and you will get a movie "output.avi", including a soundtrack from soundfile.wav. "-r 3" means 3 frames per second, so play around with this value if you want a 'faster' or 'slower' movie.
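If you want the movie to end roughly when the music does, a quick way to pick the frame rate is to divide the number of pictures by the length of the soundtrack in seconds (the numbers here are made up):
ls *.JPG | wc -l (counts the pictures, e.g. 540)
avconv -i soundfile.wav (complains about a missing output file, but prints the duration, e.g. 00:03:00.00 = 180 seconds)
540 pictures / 180 seconds = 3, so "-r 3" fits.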
Something else to keep in mind: The pictures must be named in numerical order: 0001.JPG, 0002.JPG etc.
If you don't have your pictures named like that, try this command (all on one line in the terminal):
i=1; for f in *.JPG; do ln -s "$f" "$(printf "%04d.JPG" $i)"; i=$((i+1)); done
This will create numbered symlinks (0001.JPG, 0002.JPG, ...) for all *.JPG files in your folder, in the order in which the shell expands *.JPG (i.e. alphabetically).

This is the result: The final video

As usual, a couple of simple examples in the man page would have been helpful.

Installing Galaxy on CentOS 6.3 with a MySQL db and running it as a non-root user

Friday, February 8th, 2013

There's a piece of biomedical research software called Galaxy. I didn't know that either 😉
The installation is easy, but it uses a SQLite 'db' and must be started by whoever wants to use it. In a production environment, this is not convenient and does not scale nicely. To be fair, the makers provide information on how to run it in a production environment.

Here is one such installation in detail. Maybe this helps you.
-OS: CentOS 6.3
-DB: MySQL
-Galaxy is run by a non-root user
-Galaxy starts at system boot

Lines starting with # must be run as root. Some lines are comments or context, so you can't just paste the whole thing line by line into your shell. Make sure you understand what you are doing (sorry, the line breaks make it a bit hard to read).

After the installation, open Firefox and visit localhost:8080 to use Galaxy.

**************************

===========================================================
= Installation of Galaxy with a local mysql DB on CentOS6 =
===========================================================

mysql
=====

# yum install mysql-server
# yum install mysql
# yum install mysql-devel

# service mysqld start

# /usr/bin/mysql_secure_installation

Set root password? [Y/n] Y
root pwd: <pwd>

Remove anonymous users? [Y/n] Y

Disallow root login remotely? [Y/n] Y

Remove test database and access to it? [Y/n] Y

Reload privilege tables now? [Y/n] Y

(http://wiki.galaxyproject.org/Admin/Get%20Galaxy)

(sets mysqld to start on reboot)
# chkconfig mysqld on

add another db user
——————-

/usr/bin/mysql -u root -p (enter pwd)

mysql> INSERT INTO mysql.user (User,Host,Password) VALUES('galaxy','localhost',PASSWORD('<pwd>'));
mysql> FLUSH PRIVILEGES;

create a galaxy db
——————

mysql> CREATE DATABASE galadb;

grant user ‘galaxy’ all permissions on db ‘galadb’
————————————————–

mysql> GRANT ALL PRIVILEGES ON galadb.* to galaxy@localhost;
mysql> FLUSH PRIVILEGES;
mysql> quit
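
(side note: on this MySQL version, adding the user and granting the permissions can
probably be combined into a single statement; I haven't tested that on this setup)

mysql> GRANT ALL PRIVILEGES ON galadb.* TO 'galaxy'@'localhost' IDENTIFIED BY '<pwd>';
mysql> FLUSH PRIVILEGES;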

mercurial
=========

# yum install mercurial

galaxy installation
===================

# cd /usr/local
# mkdir galaxy
# cd galaxy/
# hg clone https://bitbucket.org/galaxy/galaxy-dist/

# sh galaxy-dist/run.sh

--> starts a local galaxy instance, which can be opened in a browser at localhost:8080
^C --> quits

change settings for production server
=====================================

(http://wiki.galaxyproject.org/Admin/Config/Performance/ProductionServer)

disable developer settings
————————–

# cd /usr/local/galaxy/galaxy-dist/
# cp universe_wsgi.ini universe_wsgi.ini.orig
# vim /usr/local/galaxy/galaxy-dist/universe_wsgi.ini
(line 370) debug = True --> debug = False
(line 383) use_interactive = True --> use_interactive = False

use a local mysql db
——————–

set db connection in universe_wsgi.ini
(line 93) database_connection = mysql://galaxy:<pwd>@localhost/galadb?unix_socket=/var/lib/mysql/mysql.sock
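
Before starting Galaxy again, it doesn't hurt to check that the 'galaxy' db user can actually access 'galadb' (enter <pwd> when prompted, type 'quit' to leave):

/usr/bin/mysql -u galaxy -p galadb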

securing the galaxy installation by running it as non-root
==========================================================

(create a local user "galaxy")
# useradd -c "local user for galaxy installation" -d /home/galaxy -m -U galaxy
# passwd galaxy    (enter <pwd> twice when prompted)

**********************
* running galaxy with the local user galaxy will throw an error
*
ssh galaxy@host
[galaxy@host ~]$ sh /usr/local/galaxy/galaxy-dist/run.sh
-->
OSError: [Errno 13] Permission denied: './database/tmp/tmpeeJTbo'
*
* so we need to fix this by chowning the installation folder to galaxy
**********************

# cd /usr/local/galaxy/
# chown -R galaxy:galaxy galaxy-dist/

**********************
* now it should run
ssh galaxy@host
[galaxy@host ~]$ sh /usr/local/galaxy/galaxy-dist/run.sh
Starting server in PID <PID>.
serving on http://127.0.0.1:8080
* yes, it does
**********************

crontab for user galaxy (run "crontab -e" as user galaxy and add these two lines):
SHELL=/bin/sh
@reboot $SHELL /usr/local/galaxy/galaxy-dist/run.sh >>/tmp/galaxy.log
**********************
* --> galaxy will start automatically after the next reboot
* as the log file is in /tmp, it will disappear after a reboot
* put it into /var/log and chown it to make it more persistent, see the example below
**********************
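One possible way to do that (the path is just a suggestion):

# mkdir /var/log/galaxy
# chown galaxy:galaxy /var/log/galaxy

and then change the crontab line of user galaxy to:

@reboot $SHELL /usr/local/galaxy/galaxy-dist/run.sh >>/var/log/galaxy/galaxy.log 2>&1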
* after the reboot, you can check whether galaxy was really started at system boot:
* [user@host ~]$ ps -ef | grep gala
* galaxy    2864  2862  0 15:44 ?        00:00:00 /bin/sh -c $SHELL /usr/local/galaxy/galaxy-dist/run.sh >>/tmp/galaxy.log
* galaxy    2865  2864  0 15:44 ?        00:00:00 /bin/sh /usr/local/galaxy/galaxy-dist/run.sh
* galaxy    3148  2865  2 15:44 ?        00:00:07 python ./scripts/paster.py serve universe_wsgi.ini
* galaxy    3180  2862  0 15:44 ?        00:00:00 /usr/sbin/sendmail -FCronDaemon -i -odi -oem -oi -t -f root

=================================================

**************************

Using PowerShell to keep your Windows server/workstation from cluttering up

Friday, February 8th, 2013

This blog entry and the script will give you an example of how to use PowerShell to:

-specify a 'model' file
-search a directory and fill an array with the filenames you filtered out
-work on each entry of the array and do something with it, e.g. delete the file
-write everything into a log file

I use the script on huge db dumps: it deletes them if they are the same size as the original db. Yes, the logic is flawed, but for the moment it does its job. The script is run by the Task Scheduler on a daily basis.

Basically, the script does what it says above. All the required variables can be, no, must be set within the script. The script does not accept any parameters. For your convenience, you can also download it from here:
dbbakcleanup.ps1

here goes:
*************************************************************

# automated bkp cleanup:
# scans a directory, deletes files of the same size as a provided model file
# and writes a summary in a specified log file
#
# ideas for improvements:
# 1) --dry parameter to prevent any action
# 2) write log to event viewer
# 3) ?
#
###########################
#
# variables
#
###########################
#
# General error code, used to quit (exit) the script if not 0
$GenErrCode = 0

# array keeping all log entries which will be dumped into $logfile
$log_lines = @()
# `n <-- inserts a newline in the logfile
$log_lines += "`n"
$log_lines += "**********************************"
$log_lines += "`nINFO: Date: "
$log_lines += Get-Date
$log_lines += "`nInitializing..."

# where to look for files (adjust the path to your environment)
$mydir = "D:\path\to\my\dir"
if( Test-Path $mydir ) {
    $log_lines += "`nINFO: Dir $mydir found, continuing"
} else {
    $log_lines += "`nERR: Dir $mydir not found, stopping..."
    $GenErrCode = 1
}

# reference file for the size comparison (adjust the path to your environment)
$modelPath = "D:\path\to\my\file"
if( Test-Path $modelPath ) {
    $modelFile = Get-Item $modelPath
    $log_lines += "`nINFO: File $modelFile found, continuing"
} else {
    $log_lines += "`nERR: File $modelPath not found, stopping..."
    $GenErrCode = 1
}

# filter the files by part of their file name when filling $my_array
# as an example, I filter by the file extension; it could also be part of the file name
# the filter gets applied below when $my_array is filled; if the filter is not a file
# extension, adjust the position of the wildcard (e.g. "$pattern*")
$pattern = ".mdf"

# using $pattern to filter the files when filling the array...
$my_array = @(Get-ChildItem -Path "$mydir\*$pattern" | Sort-Object Name -Descending)
if ( $my_array.Length -gt 0 ) {
    $log_lines += "`nINFO: Array my_array not empty, continuing..."
} else {
    # array empty, should not happen, so just in case...
    # the array name is hardcoded here (my_array); an improved version could use a variable instead
    $log_lines += "`nERR: Array my_array empty, stopping..."
    $GenErrCode = 1
}

# array containing the names of the deleted files
#$del_files = New-Object System.Collections.ArrayList
$del_files = @()

# specify the name and location of the log file
# it could be in a different directory than $mydir
$logfile = "$mydir\my_log.txt"
if( Test-Path $logfile ) {
    $log_lines += "`nINFO: Logfile $logfile found..."
} else {
    $log_lines += "`nINFO: Logfile $logfile not found, creating a new logfile..."
}

# exit the script if an error occurred during initialization...
if ($GenErrCode -eq 1) {
    $log_lines += "`nERR: GenErrCode not 0, writing logs and exiting..."
    # write $log_lines to $logfile
    Add-Content $logfile $log_lines
    exit
}

#########################
#
# end of initialization
#
#########################
#
# functions
#
#########################

function fileProcess($filejob)
{
    # $filejob is the full path of the file to delete
    # during testing, use the -WhatIf line instead: PowerShell's built-in dry-run feature
    #Remove-Item $filejob -WhatIf
    Remove-Item $filejob
}

#########################
#
# end of functions
#
#########################
#
# script
#
#########################

$log_lines += "`nINFO: Starting mainloop..."

$filecount = 0
ForEach ($objElement in $my_array)
{
    # compare the file size; if it matches the model file, call fileProcess to delete it
    if ( $objElement.Length -eq $modelFile.Length ) {
        fileProcess $objElement.FullName
        $del_files += $objElement.Name
    }
    $filecount += 1
}

$log_lines += "`nINFO: Counted " + $filecount + " instance(s) to process..."

# the lines starting with Write-Host were used for troubleshooting
# feel free to uncomment them to get some additional information at runtime
#Write-Host "filecount is" $filecount
#Write-Host "del_files is" $del_files

if ( $del_files.Length -lt 1 ) {
    $log_lines += "`nINFO: No files deleted!"
} else {
    if ( $del_files.Length -eq 1 ) {
        $log_lines += "`nINFO: Deleted this file: $del_files"
    } else {
        $log_lines += "`nINFO: Deleted these files: $del_files"
    }
}

#Write-Host $del_files
#Write-Host $log_lines

#
$log_lines += "`nScript finished, writing logs..."
Add-Content $logfile $log_lines

#########################
#
# end of script
#
#########################

*************************************************************

I'm sure there are suboptimal things in the script. Please take it as it is, improve on it, use it as you like.
I wanted the function to do more than just delete the file: I also wanted it to write to the log file, but I couldn't get that to work. (Most likely a scoping issue: appending to $log_lines inside a function creates a local copy, so using $script:log_lines should do the trick.)

HowTo transfer files from Ubuntu 12.04 to your Huawei Smartphone

Wednesday, January 9th, 2013

For some reason, Google dropped USB mass storage file transfer in Android 4.x in favor of MTP, which makes it somewhat hard to transfer files to and from an Android-based smartphone if you don't have a Windows or Mac OS X installation around. This move has not been very popular with a lot of Linux users, especially since Android is Linux-based, too.

However, apart from ftp'ing or ssh'ing files around, there is an alternative: Darran Kartaschew brewed up his own simple MTP transfer client called gMTP.

On Ubuntu, it can be installed via sudo apt-get install gmtp

Afterwards, connect your smartphone to the PC with the USB cable that came with the phone and (on a Huawei smartphone) change the USB setting to "HiSuite". gMTP will ask whether you want to mount the SD card or the device itself, and there you go. If you select the device, you can browse the complete Android installation. No rooting required.

gMTP's GUI looks somewhat rough and the transfer is not that fast, but for the occasional transfer it should be enough. It's still much better than iTunes 11 😉

HTH

Custom-sized Pearl…

Tuesday, January 8th, 2013

I wrote about Pearl before. This time, prepare for a custom-sized board! Originally I wanted to try a 25×25 board, but even with an 8-core CPU and 16 GB of RAM it takes forever to generate one. So I settled for a 16×16 board.

After drawing all the obvious lines, it looked like this:

pearl_16x16_prep

And based on this very good preparation it took not even 10 minutes to finish the game:

pearl_16x16_finish

Back to programming and system engineering!

Ubuntu’s recent developments and my opinion on them

Monday, December 31st, 2012

Howdy… one last blog entry in 2012.

Ubuntu has come a long way and in the recent releases has introduced a lot of changes, e.g. the global menu, Unity, HUD, the scrollbar…

My opinion on most of these changes: they don't deliver what they promise, and I could do without them. Thankfully, it's easy enough to uninstall most of them.

On Unity
A lot has been written about Unity. A fair number of people don't seem to like it; others seem to get along with it well. I'm somewhere in the middle: navigating around can be done with the keyboard only, which is great. Other aspects are not as positive: the launcher is fixed to the left side of the screen, and it's not possible to attach widgets to the top bar like it was in GNOME 2.
One thing I sorely miss: an overview of all installed applications. Other than typing letters of the alphabet into the dash to see which applications come up, I haven't been able to find a workaround.

On the global menu
As far as I understand, "GUI experts" explained that the global menu results in fewer mouse movements and aids the user because the menus are always in the same place.
Unfortunately, I work on a 27″ screen at the office. I permanently have several programs and/or windows open, and the global menu is not only confusing but also requires me to move the mouse around much more than without it.
Good thing though that the global menu can be removed: sudo apt-get remove firefox-globalmenu thunderbird-globalmenu appmenu-gtk appmenu-gtk3 appmenu-qt
Mac OS X has a global menu as well. I don't like it there either. The difference to Ubuntu's global menu: it can't be removed.

On the HUD
So far, I've only ever activated this feature by mistake and closed it again just as quickly. Either I know a key combination by heart because I use it that often, or I have to (and actually want to) look around in the menus.

On the reworked scrollbar
Another one of those changes that weren't really necessary in my opinion. The old scrollbar worked just as well, and the few pixels saved don't justify a new scrollbar that is sometimes hard to click on and move. Even worse, whereas most dialogue windows used to be completely resizable in any direction, some are now fixed in size, e.g. the text field below the update application. It's now so small (and can't be resized) that no useful information whatsoever can be read there.
The scrollbar can be reverted to the old behaviour in 12.04 like this:
gsettings set org.gnome.desktop.interface ubuntu-overlay-scrollbars false

On the shopping lens
Well, not much to say about this one. Not only was there an outcry by many Ubuntu users, even the EFF thinks this one reeks. And it does. To remove it:
sudo apt-get remove unity-lens-shopping

In conclusion, my feelings towards Ubuntu have become really mixed. Where Debian used to have no sparkle compared to Ubuntu, Ubuntu was just as stable but came with newer software. Now, Ubuntu seems to be heading in a direction that many users don't seem to enjoy, and they consequently jump ship for e.g. Mint. Despite everything I wrote here, I'm still reasonably happy with Ubuntu 12.04 (not so much with 12.10) and I'm following the development of 13.04 closely, as ever. I don't plan on changing distros (yet), also because I wouldn't know which one to pick. Fedora releases too quickly with too many experimental features, and besides, I can't stand RPM-based distros. I'd have to give Mint another try, or maybe Debian.

Enough ranting for this year. Let's keep our fingers crossed for next year!
cheers!

A useful command line tool: watch

Tuesday, November 13th, 2012

It's 2012, almost 2013, I've been working with Linux for 9 to 15 years (depending on how you count), and yet I still come across useful things unknown to me once in a while:

Enter ‘watch’

In a sense, it's like the '-f' switch of 'tail', but it does its magic on whole commands instead of just files.

By default, it runs whatever command you supply every two seconds and displays the output. Rather than hitting the up arrow and Enter every two seconds, issue one elegant watch 'ps -ef | grep myprocess' and lean back while your fellow Linux newbies play the old hackin'-away-at-the-keyboard game.
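A few more examples (the paths are made up):
watch -n 5 df -h /home (runs "df -h /home" every 5 seconds instead of every 2)
watch -d ls -l /var/log (highlights what changed between two updates)
And note the quotes around the ps/grep line above: without them, your shell would pipe watch's output into grep; with them, the whole pipeline is re-run by watch every two seconds.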

Other than 'have a look at the man page!', I can't really think of anything else left to write, so my job here is done.

tk,
m.

Almost one grand of mnemosyne entries, yay…

Thursday, September 27th, 2012

Wow, just look at that 4-digit number in the lower right corner (not marked in red, so you have to look harder ^_^)

I wonder if mnemosyne will crash if I add another entry? Or maybe the universe will collapse.

Anyway, I got started on mnemosyne about four years ago and have since added 10³-1 entries, ranging from Japanese, Chinese and Italian vocabulary to keywords for the LPIC tests and more. Only a few were imported from lists; all the others were entered by hand.

This software really has helped me a lot memorizing all that vocabulary and I can only recommend it. Sometimes it’s hard to do these exercises every day but it’s definitely worth it.