New show on ESPN – E:60

I know it’s crazy, it’s a little after 6 a.m., but I missed the airing last night, and I have just been watching a new show on ESPN called E:60. It’s an investigative sports show and it’s pretty darn good. This episode had segments about gambling in college, 14-year-old bullfighters in Mexico, Chad Johnson and Curt Schilling. All the segments were very well done and pretty informative.

I highly recommend the show; it airs on Tuesdays at 7 PM, or in my case Wednesdays at 5 AM, on ESPN.

Claws-Mail – Part IV

Claws-Mail replaced the love I had for Thunderbird. I had been a Thunderbird user for many years, and at the time it was a great replacement for Outlook Express.
I blogged before that I couldn’t use Claws-Mail, but after I switched back to Thunderbird, things began to bug me. The layout and speed of Claws-Mail were things I was missing in Thunderbird. So I decided to change the way I was using GPG and signatures to accommodate Claws-Mail. I did this while still using Thunderbird, and once it was clear it worked, I switched back to Claws-Mail again.

I switched back just about an hour ago and I love it. I don’t feel bad about it, unlike when I switched from Claws-Mail to Thunderbird, when I was always looking back at Claws-Mail and trying to see if I could make it work.

The GetDeb project

I’m a GetDeb package builder, and on the Ubuntu development discussion list there’s currently a thread about the GetDeb project and why we don’t participate in the backports program.

A little bit of history: I stumbled upon GetDeb when I was looking for Pidgin. Back then it wasn’t available in Feisty and I wanted to use the latest version. When I saw the website I got very excited about its goal of supplying the latest software for Ubuntu.

The main reason I started creating packages for GetDeb was the fact that it was so easy to participate. I created an updated package and within two days it was up on the site. I tend to create packages I use myself or that I believe are a great asset to Ubuntu. This is one of the reasons you won’t see me creating packages for games at the moment.

I did check to see if I could help out creating packages for, as some call it, the inside Ubuntu community. All I could find was becoming a MOTU, which is a whole process, and I wasn’t, and still am not, ready for that. I didn’t know you could help out with backports without becoming a MOTU.

I will check out the backport process and see if I can help out there as well. I won’t abandon the GetDeb project; it’s a great project to participate in.

Using git – Part II

Well, I can say I’m very happy with git. I now use it to maintain this blog as well: not the posts themselves, but the layout, additional plugins, etc. While doing this I ran into something odd.
Maybe a little bit of explanation on how it’s set up right now: I have a main repository folder and a working repository. Maybe it’s not the ideal situation, but that’s the way it is.
Whenever I make changes I can check them locally, as the working folder is also the folder for my local Apache, so I can see changes while I’m working. When satisfied, I push from the working repository and update my remote website from the main repository. At least, that’s the way I wanted it to work.
The first time I made changes I pushed them out and uploaded the folder to my remote website with FTP. I checked the remote website and the changes I made weren’t there. So I checked my local test site and sure enough, the changes were there. I pulled the main repository and it said everything was up to date; I pushed again and got the same reply: everything is up to date. OK, now I’m confused. I checked the changed file in my working folder and the changes were there, but when I checked the same file in the main repository folder, the changes weren’t there! What was happening? A git status in the main repository showed the changes I pushed earlier weren’t committed. Time to search the Internet 🙂

The FAQ on the git website gave me the answer:

Why won’t I see changes in the remote repo after “git push”?
The push operation is always about propagating the repository history and updating the refs, and never touches working tree files. In particular, if you push to update the branch that is checked out in a remote repository, you will not see the files in its work tree updated. This is a conscious design decision. The remote repository’s work tree may have local changes, and there is no way for you, who is pushing into the remote repository, to resolve conflicts between the changes you are pushing and the changes the work tree has. However, you can easily make a post-update hook to update the working copy of the checked-out branch. The main problem with making this a default example hook is that it only notifies the person doing the pushing if there was a problem. (See http://lists-archives.org/git/611684-contrib-hooks-add-post-update-hook-for-updating-working-copy.html or the earlier, more easily cut-and-paste-able version http://lists.zerezo.com/git/msg595210.html.) A quick rule of thumb is to never push into a repository that has a work tree attached to it, until you know what you are doing. See also the entry “How would I use ‘git push’ to sync out of a firewalled host?” in this FAQ for the proper way to work with push with a repository with a work tree.
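That post-update hook the FAQ mentions is just a small shell script in the remote repository. Here is a self-contained sketch of the idea; all the paths are throwaway temp directories I made up for illustration, not my real setup, and it assumes the remote is configured to allow pushing to the checked-out branch:

```shell
set -e
tmp=$(mktemp -d)

# A "remote" repository that, unlike a bare repo, keeps a work tree
# checked out -- exactly the situation the FAQ warns about.
git init -q "$tmp/site"
git -C "$tmp/site" config receive.denyCurrentBranch ignore

# The post-update hook: after every push, force the work tree to
# match the newly pushed branch. Hooks run with the .git directory
# as their working directory, hence the "cd ..".
cat > "$tmp/site/.git/hooks/post-update" <<'HOOK'
#!/bin/sh
cd ..
GIT_DIR=.git git checkout -f
HOOK
chmod +x "$tmp/site/.git/hooks/post-update"

# Clone it, commit a file, and push it back.
git clone -q "$tmp/site" "$tmp/work" 2>/dev/null
cd "$tmp/work"
echo "hello" > page.html
git add page.html
git -c user.name=demo -c user.email=demo@example.com commit -q -m "add page"
git push -q origin HEAD

# Thanks to the hook, the pushed file now shows up in the
# remote repository's work tree.
ls "$tmp/site"
```

Without the hook (and without `receive.denyCurrentBranch ignore`), the push would update the remote’s history but leave its work tree untouched, which is exactly what bit me.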

OK, that explains it, and it makes sense, but I hadn’t thought about it. It just tells me I should RTFM before using software 🙂
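For my own setup, the simpler fix is the FAQ’s rule of thumb: push into a repository that has no work tree at all. A minimal sketch of that layout, again with made-up temp paths instead of my real folders:

```shell
set -e
tmp=$(mktemp -d)

# The "main" repository is bare: it holds history but no work tree,
# so pushing into it can never leave checked-out files stale.
git init -q --bare "$tmp/main.git"

# The working repository is a normal clone of the bare one.
git clone -q "$tmp/main.git" "$tmp/work" 2>/dev/null
cd "$tmp/work"
echo "<h1>hello</h1>" > index.html
git add index.html
git -c user.name=demo -c user.email=demo@example.com commit -q -m "first page"

# Pushing now really updates the main repository's history...
git push -q origin HEAD

# ...which we can confirm by reading the bare repo's log directly.
git --git-dir="$tmp/main.git" log --oneline
```

From the bare main repository the site can then be exported or uploaded however you like, with no attached work tree to get out of sync.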

Is Python the solution?

Before I got involved in the Ubuntu community, I was briefly involved in the Fedora Infrastructure. Most of the tools they used were written in Python. When I got more involved in the Ubuntu community, I noticed that Python was used a lot there too. I’m curious why two major Linux distributions chose to use Python so intensively. During my Fedora time I noticed many Python-related activities weren’t being picked up, and I asked the following on the Fedora mailing list:

Just out of curiosity but why are our webapps written in Python and not in Perl for example?

I have a feeling there is more Perl knowledge among infrastructure specialists than there is Python knowledge.

The reply was simple: somebody started in Python a while back, Python is structured, and with Perl it’s very easy to create unreadable code.

I don’t agree; Python can lead to unreadable code as well. Now, I don’t know Python, but I know several other programming languages. I’ve been programming for over 25 years and have seen my share of good code and absolute garbage, and it didn’t matter what language was used. I truly believe the difference between good code and garbage is the programmer, not the programming language. Sure, certain programming languages can help in setting up a good structure, and therefore should make it a little bit easier to write readable code. I used to program in COBOL, and I can say that was one programming language with lots of rules and structure. It sure helped, but I had coworkers whose code was horrible to debug or near impossible to extend.
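To make that concrete, here is a tiny made-up example of the same function written twice in Python; both versions work, but only one of them is pleasant to maintain. The function names are my own invention, purely for illustration:

```python
def sum_even_readable(numbers):
    """Return the sum of the even numbers in the list."""
    total = 0
    for number in numbers:
        if number % 2 == 0:
            total += number
    return total


def sum_even_cryptic(l):
    # Functionally identical, but good luck debugging this at 6 a.m.
    return sum(x for x in l if not x % 2)


print(sum_even_readable([1, 2, 3, 4]))  # 6
print(sum_even_cryptic([1, 2, 3, 4]))   # 6
```

The structured language nudges you toward the first version, but nothing stops you from writing the second. That choice, to me, belongs to the programmer, not the language.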

Again, I don’t know Python; I have seen some programs and that’s it, so I don’t know how easy it is to write good code or make it completely unreadable. I will be teaching myself Python over the next few months, as I would love to help out with some of the Python issues I see in the Ubuntu community as well. Maybe I’ll even start a new blog series: “Teaching myself Python”. I don’t know how steep the learning curve is, but I’ll give it a shot.
