Feeling Blue

December 26, 2010



Originally uploaded by hkokko

This large male elephant seal was basking in the sun at the Piedras Blancas elephant seal rookery at San Simeon, along the Coastal Highway in California. This guy seemed to be in pretty good shape for the beach.

Categories: Uncategorized

Yellow Brick House

September 5, 2010



Originally uploaded by hkokko

I found this old yellow brick house on an Open House Helsinki (http://www.openhousehelsinki.fi) walk yesterday. Strong colors and angles invited me to take many photographs even though the light was overcast. This city has lots of treasures for photographers if you persist in looking for them.

Open House Helsinki is arranged once a year and is a nice way of learning about parts of Helsinki that are usually closed to the public.

Categories: Uncategorized

It’s been a HOT summer

July 24, 2010



Originally uploaded by hkokko

Hot weather has been with us all summer: Las Vegas, Death Valley, Yosemite, Willits, and here in Finland.

I captured this sea of Coke bottles at a small carnival during the Willits Frontier Days in California.

Categories: Uncategorized

Dreaming of Summer

April 5, 2010



Originally uploaded by hkokko

It was like a dream from summers gone by, a dream from simpler days. So near but not reachable.

This beautifully restored Ford Thunderbird was on display at the FHRA American car show.

It must be spring, as car fever sets in.

Categories: Uncategorized

Technical Debt – how to live with it

March 2, 2010

Technical debt occurs in many, if not most, software development projects. There are many varieties of technical debt: allowing your software to get out of shape, allowing your software to depend on too-old third-party software, and allowing your codebase to gather an unreasonable amount of code built with bad coding practices.

It can be hard to measure technical debt or its effect on product development. Individual classes or modules might look pretty much OK, yet the cumulative effect can be devastating to your velocity, especially if the life span of the software is long.

Getting out of shape

Software gets out of shape with every change made to it if you are not observant. Every change is an opportunity to refactor the software so that it is easier to change in the future. If you have not done a proper amount of refactoring, your codebase may get so out of shape that changing or adding features becomes slow enough to hurt your time-to-market, and the effort per feature is no longer competitive.

Constant refactoring is a way of keeping the software in shape. The cost of refactoring in the large (across many classes or modules) can be high if you do not have an extensive and fast test harness to verify that the change really works. The temptation is high to make just the small change to the individual method or class that is needed.
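
As a minimal illustration (the invoice class and its numbers below are entirely hypothetical), a behavior-preserving refactoring such as Extract Method is quick and safe exactly when a fast test pins the behavior down first:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Hypothetical class used only to illustrate the idea.
    class Invoice {
        private final double amount;

        Invoice(double amount) {
            this.amount = amount;
        }

        // The discount rule used to be inlined here; after Extract Method
        // it has a name of its own and a seam for further changes.
        double total() {
            return amount - discountFor(amount);
        }

        private double discountFor(double amount) {
            return amount > 1000 ? amount * 0.05 : 0.0; // 5% over 1000
        }
    }

    public class InvoiceTest {
        @Test
        public void largeInvoiceGetsFivePercentDiscount() {
            assertEquals(1900.0, new Invoice(2000.0).total(), 0.001);
        }

        @Test
        public void smallInvoiceGetsNoDiscount() {
            assertEquals(500.0, new Invoice(500.0).total(), 0.001);
        }
    }

With tests like these in place, the same refactoring that feels risky on an untested codebase takes minutes.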

Refactoring the code may require touching many classes, so the effort and risk can be higher, especially if the code is new to the developer and test coverage is low. The risk of not refactoring also increases when rapid changes to the code are needed due to project deadline pressure or an urgent customer problem. Inexperience in development skills, or in the particular language and tooling, also adds to the risk of the software getting out of shape. In some cases proper tools for refactoring or for automating the tests may not be available; in those cases some investment in tool development can be beneficial.

Knowledge of code smells (from Martin Fowler’s Refactoring), techniques from Michael Feathers’ book Working Effectively with Legacy Code, and the refactoring tools in IDEs can help. Tools like Lattix or Structure101 can help in detecting whether your software is in proper shape. The current tools cannot provide a 100% solution even within one language, but they are certainly helpful.

Test-driven development can be a good practice as well, but it requires strong discipline and can be time consuming if the code base is already legacy. If your code base is legacy (has little or no tests), the cost of refactoring in the large and of TDD needs to be evaluated against business value, not chanted as a mantra.
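
For a legacy code base, a cheaper first step than full TDD is a characterization test in the style of Feathers’ book: assert what the code actually does today, not what it should do, so that later refactoring cannot change the behavior unnoticed. A minimal sketch (the pricing class is a hypothetical stand-in):

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    // Stand-in for some untested legacy class.
    class LegacyPricing {
        static double priceWithTax(double net) {
            return net * 1.22; // whatever the code happens to do today
        }
    }

    public class LegacyPricingCharacterizationTest {
        @Test
        public void recordsCurrentBehaviorForATypicalInput() {
            // Written by first running with a dummy expected value and then
            // pasting the actual output in as the expectation.
            assertEquals(122.0, LegacyPricing.priceWithTax(100.0), 0.001);
        }
    }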

Allowing your software to depend on (too old) third-party software

This section mostly relates to product software in large environments, because such products usually have a longer lifespan than small custom projects.

One part of productivity comes from letting others do the work for you – at any given time, most of the innovation in the IT domain is happening outside your company anyway. This means that changes that can help you be more productive arrive every day, week, and month in the form of new versions of open source and, to a lesser degree, commercial software.

One of the key enablers of agile development is continuous integration – taking changes in constantly and testing them. You may not want to take third-party software in at every patch and release, but you should take new versions in at sensible intervals. What is a sensible interval? It is a reasoned compromise between the effort and risk of taking a new version now and of being forced to take it at an awkward time later. If you delay too long, you may find that:

  • it may cause additional work in the form of workarounds
  • it may cause incompatibilities with other software that needs to work in your runtime environment
  • you may lose the productivity benefits of that version’s innovations for the duration of the delay (for example: speed, convenience, new features, better abstractions…)
  • you may need to use older, less effective tools
  • the eventual porting effort may be larger
  • if you are forced to upgrade instead of choosing to upgrade, you may block development of features that directly add customer value

This kind of technical debt is easy to accumulate. The advantages of doing the version change can be hard to quantify in business value. Some third-party software changes so seldom that people focused only on their current sprint might forget that a change is needed. In addition, because the effect of living with the current version touches most development activities, there may be little understanding of what the everyday benefits of the new version would be. There may also be bad memories of the large effort spent on a previous version upgrade.

One should build dependencies on third-party software in a sensible fashion, so that you can switch to a new version, or even to a completely new library, with reasonable effort. One might want to try out a new version in parallel with the old one as soon as it is available; this would also give visibility into its benefits. This of course requires that you can afford to do the porting effort to the new library as a spike.
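
As a minimal sketch of such a seam (the interface and class names are my own illustration, assuming Gson as the concrete JSON library), application code depends only on a small interface, and the library-specific code is confined to one adapter:

    // Hypothetical seam over a third-party JSON library. Application code
    // depends only on this interface, so upgrading or swapping the library
    // means touching one adapter class instead of the whole codebase.
    interface JsonCodec {
        String toJson(Object value);
        <T> T fromJson(String json, Class<T> type);
    }

    // One adapter per library (or per library version). This one assumes
    // Gson; its toJson/fromJson calls are part of Gson's public API.
    class GsonCodec implements JsonCodec {
        private final com.google.gson.Gson gson = new com.google.gson.Gson();

        public String toJson(Object value) {
            return gson.toJson(value);
        }

        public <T> T fromJson(String json, Class<T> type) {
            return gson.fromJson(json, type);
        }
    }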

Being able to switch easily to a completely new library is a balancing act, as an approach that is too independent of the third-party software can fail to take advantage of that software’s key attractions. One way to handle this could be to define at the system level which third-party software you will not change and which you might. For those that you might want to change, constantly run tests with at least two different versions in your continuous integration environment.
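
One way to sketch that in practice (again purely illustrative; it reuses the hypothetical JsonCodec and GsonCodec from above) is a parameterized JUnit contract test, where the CI build would supply an adapter compiled against each library version:

    import static org.junit.Assert.assertEquals;

    import java.util.Arrays;
    import java.util.Collection;

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.junit.runners.Parameterized;
    import org.junit.runners.Parameterized.Parameters;

    // Runs the same contract test against each codec implementation, so the
    // build fails immediately if a candidate library version breaks us.
    @RunWith(Parameterized.class)
    public class JsonCodecContractTest {

        @Parameters
        public static Collection<Object[]> codecs() {
            // In a real build, the second entry would be an adapter compiled
            // against the newer library version, selected e.g. by a CI job.
            return Arrays.asList(new Object[][] {
                { new GsonCodec() },
                { new GsonCodec() } // placeholder for the new-version adapter
            });
        }

        private final JsonCodec codec;

        public JsonCodecContractTest(JsonCodec codec) {
            this.codec = codec;
        }

        @Test
        public void roundTripsASimpleValue() {
            String json = codec.toJson("hello");
            assertEquals("hello", codec.fromJson(json, String.class));
        }
    }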

I would be interested in other people’s experiences of detecting whether software is in shape and of handling technical debt.

In the next agile-category post I will talk more about code-level technical debt and static analysis tools.


Categories: Agile, linkedin, Technical debt

Aperture 3 First Impressions – Positive

February 18, 2010

After a really long wait, Apple has finally released a major new version of Aperture, its professional photo management software. It has 200+ new features – a lot. Here are my first impressions after a few days of use as a hobbyist.

I have a 700+ gigabyte photo library, mostly stored as a managed database but with two years’ worth of pictures stored as referenced files; there are something like 50,000-60,000 pictures altogether. Each of the pictures has several keywords, and the last 20,000-30,000 have been shot in RAW. I am running this on a two-year-old Mac Pro 2.66 with 9 GB of memory and an ATI HD 4870 driving two 24″ monitors, on Mac OS X 10.6.2. All the drives are fast internal SATA drives.

The upgrade ran overnight and did not require a major amount of extra disk space. Face detection took more than a day for 48,000 faces, of which it recognized only a thousand – a very poor result, as a large proportion of the faces were of the same people. Face search, however, is quite quick.

Aperture nicely recognized the few pictures that had GPS information. The interface for handling geotagged pictures is very intuitive.

On the whole, the upgrade went without a hitch, but I would definitely recommend making a good backup beforehand.

The user interface feels even smoother than in Aperture 2. This is one of the major reasons for using Aperture: I have tried Lightroom (all releases) and it just feels clunkier.

I really like the new workflow when handling pictures. When you edit a picture it is automatically made the pick, so methodically going through your latest shoot is suddenly much smoother. Labels make it easy and much faster to create a set of pictures for different audiences; earlier you needed to create a special keyword to achieve the same purpose. Together with the presets, these make it possible to produce a first cut of the pictures to show much faster.

Presets are a very useful way of applying multiple adjustments quickly. There are already a few quite nice presets created by Sara France for MacCreate: http://premium.maccreate.com/2010/02/09/aperture-3-presets-pack-volume-1/ You can of course create your own.

You can paint many, if not most, of the adjustments onto pictures with brushes. Brushes are non-destructive, so you can go back and change the adjustments easily.

One of the most missed tools in Aperture has been Curves. It gives you the ability to fine-tune the image much more than the earlier Levels did.

Most of the external plugins are 32-bit and so cannot be used without tricks. FlickrExport already has a beta out that works like a charm. I did not try Aperture’s own Flickr export. I am eagerly waiting for Nik Software to update their plugins, as I have all of them. It would be nice if these were available as non-destructive brushes, but that might be too much to wish for at this stage.

Overall the new version feels slightly more responsive than the old one once face detection has completed. There are still a few cases where the beach ball inexplicably just spins for a long period of time.

Speed seems to vary, but quite often it is noticeably faster than Aperture 2. An Auto Enhance preset on a 10 MB RAW file takes a fraction of a second. Selecting a project brings the pictures up almost immediately. Selecting a smart folder that retrieves from the whole library varies from a fraction of a second to several seconds, most often around a second.

Browsing 10 MB RAW pictures in browse mode with almost the smallest picture size selected scrolls quite nicely; the bottommost row starts to draw by the time your eye gets to the middle of the screen. When you select a picture, the Levels graph updates in around a second, more often under a second; in Aperture 2 this took longer. When adjusting a picture’s levels by dragging the points there are almost no delays, which is slightly faster than Aperture 2 on my machine. In earlier versions, pictures which had been cropped or straightened caused delays – I have not really looked into this so far.

Synchronization to the Apple TV worked like a charm after updating the Apple TV software. Sync to the iPhone also works well.

Some small things:

– Creating a smart folder with labels is not as intuitive as it should be – somehow the label button selection is not really visible in the smart folder menu.

– The default for cropping has changed – the grid is now visible by default when doing a crop.

So far Aperture 3 has been robust – no crashes – but let’s see how it goes when trying out more of the features.

I am very satisfied so far with the upgrade in my environment.

Maverick and Iceman

February 17, 2010


Originally uploaded by hkokko

I spotted these two majestic pelicans practicing formation flying at Monastery Beach, just south of Carmel on the Coastal Highway in California.

If you drive down the highway, reserve plenty of time to stop at the beaches and walk the paths to the coast.

It took a few hours and several spots to explore before I got the shot I wanted.

Categories: photography

Hello world!

February 17, 2010

Agility in practice is what interests me professionally, as a tool for productivity. Here you will find occasional observations about how to achieve it. I will also comment on interesting developments or stories in this area whenever they do not fit within Twitter’s tweet limit.

Large-scale agile software development and architecture are close to my heart, and I practice them daily. Large scale here means anything involving dozens of teams working on the same release.

Continuous integration, as a cornerstone of making agile development feasible and of keeping the rhythm, has been a focus of mine for quite a while.

Photography is a passion, taking pictures a joy, learning to photograph a lifetime journey. I will illustrate this amateur photographer’s journey.

I can also be found on Twitter as @hkokko.

Categories: Uncategorized