New Nationalities in the Testing Blogosphere


Recently I’ve been adding some new weblogs to my overview of software testing weblogs. The overview now lists 267 weblogs on software testing. I do not claim to have an overview of all software testing weblogs, but I think I have quite a large number. So it was strange for me to notice that in the last few days I added four weblogs from countries that were not yet on my list. Proof that blogging about testing is truly an international sport.

These are the newly added weblogs, their authors and their countries…


Martial Testing by Andrés Curcio and Ignacio Esmite


AskTester by Thanh Huynh and others


One Software Tester by Jason B. Ogayon


Mr. Slavchev by Viktor Slavchev


Slides of my presentation for the Agile Testing Days 2014


In November 2014 I spoke at the Agile Testing Days in Potsdam on the subject of using FitNesse to drive data through a legacy back end system. It was the first time I attended this crazy conference and I will share some of my experiences in another post. For now, here are the slides of my presentation.

A Software Tester’s Bookstore


I think it can be said that I am known for keeping lists. There is a simple reason: I like to collect resources related to testing. Mostly I do this in the fine spirit of Umberto Eco’s Anti-Library (see Nassim Nicholas Taleb’s The Black Swan for reference), to be aware of what’s there and what’s not. And, in the fine spirit of Rumsfeldian epistemology, to be aware of the fact that I’m unaware of what’s there and what’s not.

Anyway, for a while I used Google Books to track suggestions for and references to books that came across my path. I reached a total of 100 books (which is nothing, compared to the 50,000 Eco is said to have in his library) and thought it might be nice to categorize this list and show it on my website. I believe most of the books in the list contain at least bits (as in portions) of valuable information for modern-day testing.

I placed the books in an Amazon bookstore and I link to a flat list below. The categorization contains faults and could have been done in many other ways. The category ‘Management’ could, for practical purposes, be split up into ‘People management’, ‘Project management’, ‘Risk management’, ‘Leadership’ etc… These categories in their turn open up a wide landscape of sub categorizations. For now, I use a relatively flat categorization, to prevent anyone from getting truly lost. And, as categorizations go, they can never be false.

Yes, books and categories (such as coaching) are missing. As always, additions are welcome.

Here is a link to the store.

And a link to the list (PDF, opens in new window).


The oddness of two familiar words


Today, finally, the oddness of two words dawned on me. The words are ‘software tester’ and I used them, in a conversation, to describe my role in the project I am currently involved in.

The first reason why I felt strangely alienated from the description ‘software tester’ was that, for the last couple of months, I have not tested software. Not a single piece of software has been submitted to me with the question “Can you please test this?”. Instead, I’ve been reviewing (mostly functional) design. All I’ve seen is complex thoughts spelled out in writing.

Reviewing – from a testing perspective – is hard work. I have sat down with other reviewers and none of them ever described reviewing as even relatively easy. It can be tedious and mind-boggling at the same time. It involves reasoning from many perspectives. It involves recognizing the traps of your own reasoning and that of others. And it takes a lot of effort to find out which set of perspectives will lead to a manageable conclusion.

This brings me to the second word, which is ‘tester’. Today, through a couple of discussions, I realized that the tester – the independent assessor of quality – is as much bound to the context, as much the modifier of context, as he is the assessor of the design. Though we would like to see ourselves as the evaluating party, it is this evaluation, this test, that is part of the design, part of the meaning of a system in its context. It reminded me that the design is bound to the tester just as it is bound to the designer. That means that we should be excruciatingly careful in selecting our approach to testing, and that we should look not only at the result of the test but at the effect of the test on the design. The latter may turn out to be a much bigger issue than the former. Which is why I think ‘tester’ is just a very odd word.

A benignly malicious list of books


In line with the good old context-driven tradition I’m presenting, in this blog post, a list of books. I decided to use the adjective ‘malicious’ to characterize this list. As far as I know there are no books in this list that have actually done harm. But I think the knowledge in these books may be used in a benignly malicious way. If software testing is about evaluating beliefs in software systems and if good software testing is relentless in exposing the value of those beliefs then we, as testers, had better be armed with the knowledge to dissect those beliefs, ruthlessly. And since we are not paid to expose only those things that make everybody happy, there is a certain malicious aspect in our craft.

So what I want is to be as ruthless as possible in my analysis. And in order to achieve that a certain amount of knowledge is necessary. Also, to survive as a software tester, it may be necessary to be five or twenty steps ahead when it comes to reasoning. This list is one of the ways to get closer to that goal.

I did not read all of these books. In fact, most of them are on my list of books that I want to read. Most of the titles are taken from philosophy and psychology. I also decided to leave out, as much as possible, books that are said to be life- or game-changing. In other words, I want the straight stuff, from the source, not from the hype.

I think Nelson Goodman’s Languages of Art looks like a very promising read. Furthermore, Feyerabend’s Against Method seems to be one of the founding documents of context-driven testing. More on that book in some future blog post.


A satisfying testing experience on a rainy Sunday


The problem statement: an annoying pause

The issue was really quite simple: on all devices in a family home (three laptops, two tablets and one smart phone) the display of videos through the internet (whether through Youtube or another channel) was very slow. I was shown a video on the iPad. The beginning of the video was loaded, was paused after a few seconds because more of the video had to be loaded, the video ran on for a couple of seconds and then was paused again to load more of the video. This went on and on so viewing a twenty minute video was a very annoying, gruelling, experience. The lagging video display had been frustrating the family for quite some time.

I was visiting the family. It was a Sunday afternoon. It was pouring.

The system setup

There is a home network to which, through a wireless router, the three family laptops (all running a Windows operating system) are connected. Also using the wireless router are two tablets: an iPad 2 and an unidentified tablet running some version of the Android operating system. One (Android) smart phone probably uses the wireless router for internet access as well. Not all devices are switched on at the same time. The wireless router is linked to a DSL modem. Both modem and router are of the Cisco brand; the exact types of modem and router were not investigated.

First inference

Since all devices suffered from the same lags in loading and displaying videos I concluded that neither the operating systems nor the devices were the first candidates for investigation. In fact the speed of the internet connection would be a good starting point for the investigation.

The requirements

At that point two things needed to be established. One, the expected speed of the connection and two, the actual speed of the connection. The expected speed of the connection was established by looking up the service contract with the internet provider. The download speed should be 50 Mbps.

The first test: how fast is the internet connection?

With this in mind it was time to investigate the actual speed. This could be done with an online tool on the website of the provider. But since I wanted a second tool to verify the results of the tool offered by the provider I found a nifty little iPad app that also measured the speed of the internet connection.

Using these two applications, one on one of the laptops and one on the iPad, it was established that the download speed of the internet connection was at least fifteen times slower than the speed promised by the provider. According to the tool offered by the provider the speed was about 3 Mbps (several measurements in a time frame of a couple of minutes) and the speed according to the iPad app was even less: 0.5 Mbps (again several measurements in a time frame of a couple of minutes).

Notice that there is quite a difference between the measurements with the respective tools. I decided not to bother with the difference since a) it would probably be impossible to find the cause of the difference and b) both measurements clearly indicated that the connection speed was (very) low. The classification high, medium, low, very low contained enough granularity for me at that time.
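The arithmetic behind such measurements and bands is simple enough to sketch. Note that the byte counts, durations and band thresholds below are hypothetical choices of my own for illustration, not values taken from either tool:

```python
def mbps(byte_count: int, seconds: float) -> float:
    """Convert a timed download (bytes over seconds) to megabits per second."""
    return byte_count * 8 / seconds / 1_000_000


def classify(speed_mbps: float) -> str:
    """Coarse banding of a measured speed; thresholds are hypothetical,
    picked against a 50 Mbps contract."""
    if speed_mbps < 1:
        return "very low"
    if speed_mbps < 10:
        return "low"
    if speed_mbps < 30:
        return "medium"
    return "high"


# Roughly what the provider's tool reported: ~7.5 MB in 20 seconds is 3 Mbps.
print(classify(mbps(7_500_000, 20)))   # low
# Roughly what the iPad app reported: ~1.25 MB in 20 seconds is 0.5 Mbps.
print(classify(mbps(1_250_000, 20)))   # very low
```

Either band would have triggered the same investigation, which is why the difference between the two tools did not matter much at the time.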

Another thing that struck me was that the upload speed was actually higher than the download speed. Normally, on DSL lines, the download speed is significantly higher than the upload speed. I did not investigate this further but it may have been a pointer towards the cause of the problems.

I was told that the wireless router had fallen at least once. The owner of the router thought that that may have caused a defect in the router.

The second test: is the wireless router the cause of the problem?

To eliminate the router as a cause I reset it a couple of times, without any effect on the download speed. Then I decided to take the router out of the equation. I connected the laptop directly to the modem. Should the router be the cause of the troubles then with this new setup I should have a good download speed. It was possible to connect the laptop to the modem but there was no internet connection. This was the point at which I was baffled a little bit. I thought connecting the laptop to the modem should be a plug-and-play thing. But apparently this was not the case.

To see if I could get the internet connection working through the modem I rebooted the laptop. Without success. Then I started paying attention to the suggestions that Windows offered by which I should be able to fix the connection. One of them was to adjust the DHCP settings. Since I am not a wizard at network settings, I decided to leave that alone at first. Another suggestion was to reset the modem.

The solution: beaten by Windows

Why hadn’t I thought of that before? I was beaten to the chase by Windows! In the early days of DSL I had tinkered with modems and routers for the better part of an afternoon to get the network going. I remembered having tried all sorts of boot and reset sequences. The one I still use infrequently is to shut down the router, shut down the modem, boot the modem, wait till it’s fully functional, then boot the router, wait till the router is online and then start the pc or laptop.

I skipped the ‘internet connection through the modem‘ test and applied the somewhat familiar boot sequence. Then I grabbed the iPad and hit the speed test application. It now showed a very nice and satisfying download speed of about 20 Mbps. I was happy to report that the internet connection speed had just increased by a factor of 40.
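The factor of 40 is simply the ratio of the two iPad measurements; a throwaway check:

```python
def improvement_factor(before_mbps: float, after_mbps: float) -> float:
    """Ratio of the new speed to the old speed."""
    return after_mbps / before_mbps


# From 0.5 Mbps before the reboot sequence to about 20 Mbps after it.
print(improvement_factor(0.5, 20))  # 40.0
```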

The deliverables: happiness despite the rain

But was that 20 Mbps speed really satisfying? Our requirement said that the speed in fact ought to be 50 Mbps. But then I remembered that DSL connections seldom reach the specified speed. It has something to do with the distance from the switch, the number of houses using that connection, possibly the quality of the ethernet cables and interference from other devices. Interference could not be ruled out because the modem and the router were close to a phone, a television, a dvd player and such.

I could have done many other tests in search of the cause of the issue. I could have finished the ‘internet connection through the modem‘ test. I could have upgraded the firmware of the router and the modem and tested the effect of that. I could have tried a different ethernet cable. But the result of the session was reasonable and spending more time on testing seemed like a waste. I handed the iPad to the owner and he conjured up the video – a cooking instruction video – that had been plaguing him for quite some time. This time it ran without a flaw. As an extra test, on the laptop I loaded a lengthy video in Youtube (Goranka Bjedov – Using open source tools for performance testing). Goranka came through clear and without a hitch.

Though the actual cause of the error had not been found, I was happy to hand the family two testing products that may help them in the future: 1) a nice little iPad tool to measure the internet connection speed and 2) a procedure to reboot the internet connection, just in case.

I like to think I delivered exactly what was needed.

A primitive attempt at analysis


Jari’s challenge

A couple of days ago I browsed through the puzzles (testing challenges) posted by Jari Laakso on his weblog. I picked one that I found particularly interesting and tried to come up with an answer. I sent an e-mail to Jari and he replied, thanking me for my input. I did not provide the explanation that was on his mind, but this was mainly due to the fact that I should have tried to solve the challenge in a direct discussion with him, rather than by e-mail. I intend to do this later on, but in the meantime Jari’s challenge nevertheless got me thinking about analysis; the way I tried to tackle his challenge. So thanks to Jari for starting this train of thought!

The challenge is Testing Challenge – Puzzle #3 and goes as follows.

There is a 15 year old boy studying in a high school. He loves ice hockey and is the best of the team from his year. The team has been excellent in the high school championships. Recently, the dean and the teacher’s council had a meeting where they decided he is so good they must dismiss him from the team. Explain why.

Now clearly we need some form of reasoning to find the possibilities that are hidden in this text and to account for those findings. We need our old school analytical skills; the skills that have been honed to perfection through years of intensive study, training and practice.

What the holy book has to say

One of the corner stones of our craft is the 752 page testing bible TMap Next. The book has a lot to say on analytical skills, particularly in chapter 4, Foundations of Software Testing – Applying the Analytical Mind. While this chapter dives deep into the history of analytical thinking, scientific reasoning and experimentation, it also provides what I think is the best argument for the all-pervading importance of analytical skills in our craft. Chapter 4.2.1, right down the middle of page 154, goes as follows.

To fully understand the testing experience a thorough grasp of analytics and reasoning is imperative. In the previous chapter we concluded that within any software development effort the tester is confronted with systems, models of systems, descriptions of systems or opinions on the functioning of systems that are incomplete, ambiguous, inconsistent, ill structured, contradictory, distorted or just plainly incorrect. And yet this information is seldom presented accompanied with a list of footnotes explicating in full detail its inconsistencies. Why? Because the tester is presented with information that most other people believe, at the very least partly, to be correct, justified and validated. The task of the tester is to confront people with their beliefs, to show not the ways in which the system will succeed, but the situations in which the beliefs we hold may differ from what we encounter in real life. This is the test; and the tools we supply to the tester are the ways to reveal what is hidden. Within this set of instruments we find logic and reasoning and at the same time instruments to disrupt logic and reasoning. In the remaining parts of this chapter the most important of these instruments are discussed.

Armed with these instruments we move toward the challenge.

The challenge

There is a 15 year old boy studying in a high school. He loves ice hockey and is the best of the team from his year. The team has been excellent in the high school championships. Recently, the dean and the teacher’s council had a meeting where they decided he is so good they must dismiss him from the team. Explain why.

First there is a psychological hurdle we must overcome. The reason for the dismissal of the boy does not seem logical; we live in a society in which being good at something is valued, praised and awarded. Being dismissed from a team at first glance hardly seems to be an award. So the challenge is to explain this ‘illogical’ dismissal. If the dismissal were ‘logical’ it would hardly require explanation. Since I decide that the dismissal is illogical, I have a number of assumptions I must validate. For example, I assume that the boy lives in modern times and not a couple of hundred years ago. The best clue in the text as to the time in which this challenge must be placed is the fact that ice hockey is mentioned. If we take Wikipedia as a valid source we learn that modern ice hockey started in 1875 in Montreal.

The second assumption is that the boy lives in a modern Western culture not unlike mine. Other cultures may have totally different views of what is ‘good’ and what ‘dismissal’ means. In fact I think this challenge can only be ‘solved’ by someone from a Western culture. Luckily, evidence like ‘ice hockey’ and ‘high school’ points to our cosy modern Western world.

As a side note I think the information systems we create highly depend on the society in which they are implemented and on the time they are implemented. If you take this one step further you may as well state that the information systems we create are mirror images of (aspects of) the societies we live in. If you take it yet one step further you can state that information systems are elements of social systems and information systems are largely social, not technical.

Back to the psychological hurdle. The boy is dismissed because he is so good. If this is true then maybe the boy’s excellence may yet lead to a reward, for example being promoted to a better team. On the other hand, if no reward follows then we must look at the motivations behind the dismissal. The only motivation offered by the text is that the boy is so good. But other, hidden motivations may be behind this statement. We have to look into psychology. Being ‘so good’ can be seen as a compliment and maybe this compliment is given to cheer up the boy. He may have been dismissed for other reasons, such as being a bad team player (in my mail to Jari I used the football player Cristiano Ronaldo as an example), but the dean and the teachers chose to be kind to the boy and stress the fact that he is good. Maybe other players on the team felt ill at ease with so much talent around and the team, rather than the excellent player, had to be saved.

There is a world of possibilities behind the inner workings of the meeting of the dean and the teachers’ council. There are many things that the text does not reveal; we only see the curtailed conclusion. Within the council there may have been a teacher who felt that the boy’s grades suffered from playing ice hockey so much and pleaded for his dismissal. To Jari I even offered the suggestion that bets were placed on the ice hockey games (unlikely, but interesting for the sake of argument) and that the dean or the teachers had been bribed to make the team fail. The boy may or may not have parents who play a part in the decision. No parents are mentioned in the text and this is odd in some way, because parents are usually involved in school issues.


If we want to apply analysis we must learn how to reason. We must learn how to recognize and analyze conclusions, lines of reasoning, facts and assumptions. How do I judge the conclusion in the challenge? I know it is short and illogical in my world view. There are numerous gaps in this conclusion and I will have to apply reasoning to locate them. How do I judge the entire challenge in the light of my limited modern Western world mindset? In the book The Order of Things the French philosopher Michel Foucault tells us that reading the classification of animals stated in the Celestial Emporium of Benevolent Knowledge (by the Argentine writer Jorge Luis Borges) broke “up all the ordered surfaces and all the planes with which we are accustomed to tame the wild profusion of existing things”. He further argues that a categorization changes when you change the background (context) of that classification. The concept, of course, is simple: if you place a black square in a black background you will not even notice it is there. If you place it in a white background you will see a black square. Has the square changed?

Some other friends that accompany us during analysis are quantification, time and place (setting). From movies, for example, we learn that time is not absolute, that sequences of actions do not have to be consistent in time, that time may be sped up, slowed down, reversed etc… Similarly, any quantification, or lack of quantification, may hide or reveal gaps. In the challenge the boy is 15 years old and he studies in a (any?) high school; there is a “teacher’s council”, but how many teachers are on that council, and do any of those teachers actually teach the classes in which the boy participates?

Lately I’ve been thinking a lot about ‘what is missing‘ and what reasoning we can use to find what is missing. There is a nice book by the Austrian philosopher Paul Feyerabend with the rallying title Against Method. I cannot think of a reason why a software tester should not read this book. Feyerabend is all about context and he does nice things with Galileo Galilei. He uses Galilei’s reasoning in favour of the movement of the earth (around the sun). What Galilei did, according to Feyerabend, was to show something that was missing by reasoning. In Galilei’s case the movement of the earth was rejected because a rock, falling from a tower, falls straight down. If the rock falls straight down, so the argument went, it proves that the earth does not move. Galilei argued that there are movements we cannot perceive, and by demonstrating this showed that we must at least use other means of perception to draw conclusions about the movement of the earth.

Within the history of science, philosophy, sociology, biology etc… there is a wealth of reasoning. I think it can be demonstrated that the tester who can analyze from a broad base of reasoning adds immense value to any project. So if we must learn anything to add value to our craft, it is this. I hope this is a falsifiable theory.