Wednesday, August 15, 2012

The changing face of Linux gaming

Anyone interested in gaming in the Linux environment must have heard the news of Valve porting the Steam client, as well as a number of AAA games, to run natively on Ubuntu. What I am proposing in this article is not based on any insider information. I don't know Gabe personally, nor do I have any insight into Valve as a company or their ultimate goals. But if you look at the things that have been said and the direction the company is taking, some very interesting conclusions can be drawn.

First of all, let's go through what has been said and done by Valve so far. A few months ago, Gabe Newell came out and said (and I'm paraphrasing here) "Windows 8 is a catastrophe for games and game developers". He should know. He is an ex-Microsoft employee and probably has a lot more insight into what is coming from Redmond in their next incarnation of the Windows OS. There's our first point to consider: Windows, the dominant gaming environment for the PC, is moving in a direction that does not please the big games developers (Blizzard has apparently made similar statements).

Valve has been working hard on developing for Linux. They have hired a lot of top talent and have already gotten to the point where Left 4 Dead 2, their premier game at the moment, is running on Ubuntu. In fact, more than just running: as the latest post on the Valve Linux blog shows, they have gotten the game to run faster in Ubuntu than in Windows.

Valve has also made numerous statements that they want to work with Intel (and potentially other GPU hardware makers) on making the open source drivers for the GPUs a lot better, as well as looking into submitting patches to the Linux kernel to improve its capabilities for running games.

With all this going on, I am seriously led to believe that Gabe (and by extension, Valve) is looking to make the Linux desktop its premier platform for gaming. In other words, focusing primarily on the Linux desktop instead of Windows, as they currently do, when it comes to building games.

Why would I say this? Well, think about it from a business point of view. Valve's biggest income generator is not any one game, but their Steam distribution platform. And the single biggest risk facing the company's future is the platform that those games run on. If Microsoft continues to push Windows further down the same path, Valve could find themselves in a tough spot. The biggest risk to Valve is that Microsoft essentially makes gaming on the PC nigh-on impossible with Windows. They would be left with a product they just cannot, or would not be allowed to, sell.

Contrast that with the Linux desktop. Here is an environment where you may not have complete control over its direction, but you are at least able to contribute and help make sure that your needs for the environment are met. In fact, collaboration is encouraged in the FOSS world. Graphics driver a bit buggy or not performant enough? No need to wait for a third party to fix the problem; get in there and do it yourself. Kernel not handling things the best way possible for your needs? There's no big proprietary conglomerate to fight with and hope they eventually fix it. Spend a bit of money and get devs on it yourself.

Within a FOSS community, Valve can help ensure their own future. They have a say. They can be a valuable member of the community while still making sure they can continue to provide the service that makes them their profits. If Valve and other games developers really get stuck in, the Linux desktop could become the ultimate home for PC gamers.

"But gamers won't switch Gareth!" I hear you argue. But why won't they? Linux distributions are free and thanks to advances in ease of use over the last few years, dead easy to setup and install yourself. You really don't need to be some basement geek to make the move these days. In fact, I think most gamers would love to be able to get their hands on an OS that costs them nothing, free upgrades FOR LIFE, and that performs better than what they used to run on. Not to mention the possibility to tweak the crap out of the OS if they so choose to get those extra few frames per second.

Let's just get the list of evidence lined up here so everyone can see exactly what I mean, and then you be the judge:

  1. Valve and other games developers have already expressed their concern (to put it conservatively in some cases) with Windows 8 as a gaming platform.
  2. Valve has invested a lot of time, money and effort into porting their games distribution platform (Steam) to Linux as well as a number of other games.
  3. Valve has already invested work in helping improve the quality of FOSS drivers for GPUs.
  4. Valve has already mentioned possible tweaks they want to make to the kernel in order to make it more performant for gaming.
  5. Valve has no way to guarantee that their current platform for delivery of games (Windows) will continue to be reliable for gaming, which could affect their ability to provide their service and make a profit.
  6. Valve may not have complete control over the Linux kernel and FOSS driver implementations but they can have a significant impact on changes to these critical pieces of software to help ensure they CAN continue to provide their service.
  7. Linux-based distributions are cheap and (these days) easy to install and set up, so there should be little to no barrier for gamers to make the move across.
When it's all summed up like that, the conclusion I come to is that, if Valve and Gabe have their way, Linux will become the premier choice for gamers on the PC. And with that comes a helluva lot more users, a bigger market share, and hardware vendors sitting up and taking notice to the point where they can no longer ignore support for Linux-based operating systems. Valve and Gabe can do more to bring about the vaunted "Year of the Linux Desktop" than any other single act ... bar Microsoft going bankrupt overnight.

Monday, November 28, 2011

Thinking Critically #1: Homeopathy

This is the first in a series of articles I've decided to write that takes current controversial topics and looks at them from an unbiased, critical point of view, weighing up the claims and evidence (if any) that exist out there. At the end, I will also give a rundown of my personal views on the topics covered in these articles.

As I am sure you could have gathered from the title, this particular post is about Homeopathy.

What is Homeopathy exactly?

There seems to be a lot of misunderstanding as to what homeopathy actually is. People I have spoken to (family and friends) seem to misunderstand it, thinking it means natural or herbal remedies. While a lot of homeopathic remedies do purport to contain natural ingredients, a lot do not. So if homeopathy isn't about natural/herbal treatments, what is it? The easiest way to describe it is to go through the process a homeopathic remedy goes through to be prepared:

1. Identify the condition or affliction you wish to cure; insomnia, for example (for which there are many homeopathic remedies).
2. Find an active ingredient that would replicate the symptoms; in the case of insomnia, caffeine would do so. This undiluted preparation is called the mother tincture.
3. Dilute the active ingredient at a ratio of 1ml per 100ml, creating a 1C solution as per homeopathic nomenclature.
4. Perform a process, known as succussion by homeopathic practitioners, which is supposed to activate the vital energy of the active ingredient. It involves a sequence of shaking and strikes against a surface.
5. Dilute this diluted, succussed liquid into another 100ml of water at the same ratio as the first dilution and perform succussion again. We now have a 1:10,000 solution; in other words, for every 1 part of the original active ingredient there are 10,000 parts of water. This gives us a 2C solution, to follow homeopathic nomenclature.
6. Continue this process another 28 times until we arrive at a 30C solution, the recommended dilution according to homeopathy's founder, Samuel Hahnemann. We end up with a dilution of 10^-60; in other words, 1 part of active ingredient per 10^60 parts of water.

To put that into perspective, "... a patient would need to consume 10^41 pills (a billion times the mass of the Earth), or 10^34 gallons of liquid remedy (10 billion times the volume of the Earth) to consume a single molecule of the original substance ..." as quoted from Wikipedia.
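
If you want to sanity-check those numbers yourself, here is a minimal back-of-the-envelope sketch in Python. The dose size and tincture concentration are rough assumptions of my own, not figures from any homeopathic source:

    # Back-of-the-envelope check on the 30C dilution: thirty serial 1:100
    # dilutions. The 0.005 moles of active ingredient in the starting dose
    # is my own generous guess, not a real homeopathic figure.
    AVOGADRO = 6.022e23               # molecules per mole

    steps = 30                        # 30C = thirty serial 1:100 dilutions
    dilution_factor = 100.0 ** steps  # 10^60 overall

    moles_in_dose = 0.005             # assumed active ingredient in the original dose
    molecules_left = moles_in_dose * AVOGADRO / dilution_factor

    print(f"Overall dilution factor: 10^{2 * steps}")
    print(f"Expected molecules per dose: {molecules_left:.1e}")
    # Prints ~3.0e-39, i.e. effectively zero: you would need on the order of
    # 10^38 doses before you could expect to swallow one molecule of the
    # original substance.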

Why is it purported to work?

So, with such a massive amount of dilution, why do so many people believe it works? Putting aside the erroneous belief that homeopathy is about natural and/or herbal products, the following is how homeopathy is explained to work:

1. The act of succussion releases the active ingredient's energy into the water molecules, which have the capacity to "remember" that energy; the potency of that stored energy is enhanced with further dilutions and succussion.
2. That the electron makeup of the water molecules vibrates at the same frequency as the energy of the original active ingredient, and therefore imparts the benefits even with no active ingredient remaining.

There is also a lot of anecdotal evidence: reports from specific individuals claiming they suffered from an affliction that conventional medicine was unable to cure, but that after using homeopathic remedies the symptoms and affliction stopped. You can see people espousing such events on YouTube, for example, and these isolated testimonies are often used as an indicator of efficacy by homeopathic practitioners and those who believe in it.

The testimony of individual, isolated cases is not, however, a valid way to prove the efficacy of any medical treatment. There are just too many unaccounted-for and uncontrolled variables to be able to truly determine whether it was the homeopathic remedy that cured or alleviated the person's affliction or some other cause. Other possible causes that could be unaccounted for (this is not a complete list, just what comes off the top of my head):

1. Environmental: Perhaps some aspect of the person's environment changed, such as a change of season (pollen, temperature, winds, etc.), or they are no longer subjected to an external cause such as heavy metals, fungi, spores, etc.
2. Placebo Effect: A known medical phenomenon whereby a person given a treatment and told the treatment will help them feels an improvement even though the treatment was faked (sugar pill, fake surgery, etc). Essentially an anomaly produced by the mind that replicates the expected effects of the treatment had it been real.
3. Daily Habits: Perhaps the person started eating a healthier diet at the same time? Maybe they started an exercise regimen? Perhaps they shower more regularly, use a different route to work, or any number of possible alternatives...

It is also worth mentioning that there are many individual, isolated cases where people have shown that homeopathic remedies do not work. These individuals will, for example, take an entire bottle of homeopathic sleeping pills and show no effect.

What undeniable evidence exists to prove the claims?

I have hunted high and low for studies by those for, against, and neutral towards homeopathy. While there seem to be many studies showing homeopathy has no benefit beyond that which a placebo would give you (such as this one, this one, this one, this one or even this one), I have yet to find any definitive studies proving the efficacy of homeopathy to be greater than a placebo.

While I have read a number of studies that claim to provide undeniable evidence for the efficacy of homeopathy, there have been numerous problems with these:

1. The study is not done in an isolated lab environment and therefore cannot correctly control for other possible effects within statistical norms.
2. The population of study participants is not validly randomised, thereby biasing the end result (one study allowed the participants to choose whether they wanted to be treated with the homeopathic or the conventional medicine, which would bias the placebo effect).
3. The participants aren't limited in the affliction they suffer from, thereby introducing many unknowns into the study that cannot adequately be accounted for (one study used a group of cancer sufferers as the basis of its investigation but included around 10-15 different types of cancer).
4. Cherry-picking of data. Some studies blatantly ignore any results that count against homeopathy and only choose the results that count for it.
5. Comparisons of homeopathic treatments to invalid data sets. Some studies will take the results of the treatments and compare them to conventional medicine results from other sources, such as other hospitals or even global averages, instead of comparing with data retrieved from similar situations.
6. Separate groups are treated differently. One study had the homeopathic group also receiving additional care from medical staff where the "modern medicine" group did not; the improvement in quality of life of the participants would therefore be biased.

What would be undeniable evidence?

There is a very simple and logical method to attempt to prove the efficacy of any drug, a technique used all the time by pharmaceutical companies. It goes something like this:

1. Get together a large group of people. The more the better, but no fewer than, say, a couple hundred. These people all need to be suffering from the same affliction that your treatment/remedy is supposed to cure, at relatively the same intensity if possible.
2. Split these people into two groups with roughly the same number in each group.
3. Tell both groups they will be given a remedy to help cure or alleviate the condition they suffer from.
4. Give only ONE group the actual remedy you want to test while giving the other group a placebo.
5. While the test is underway, both groups are treated identically so as not to bring any bias into the results.
6. Measure the changes in participants over time from both groups by asking the same questions and doing the same tests.

After such a study you will be able to compare both groups to each other. It seems reasonable to assume that if a remedy/treatment actually worked, then the test group should show much better results than the control group. This setup also helps negate the placebo effect to some degree.
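
As a toy illustration, here is a minimal Python sketch of that comparison for an inert remedy; the group sizes and effect sizes are made up by me, not taken from any real trial. Since the remedy does nothing, both groups improve only by the placebo effect, and a simple permutation test should report no significant difference:

    import random
    import statistics

    random.seed(42)

    # Simulated symptom-improvement scores. An inert remedy means both groups
    # are drawn from the same distribution (placebo effect only).
    placebo_group = [random.gauss(1.0, 1.0) for _ in range(200)]
    remedy_group = [random.gauss(1.0, 1.0) for _ in range(200)]

    observed = statistics.mean(remedy_group) - statistics.mean(placebo_group)

    # Permutation test: how often does randomly relabelling participants
    # produce a group difference at least as large as the one observed?
    pooled = placebo_group + remedy_group
    extreme = 0
    trials = 5000
    for _ in range(trials):
        random.shuffle(pooled)
        diff = statistics.mean(pooled[:200]) - statistics.mean(pooled[200:])
        if abs(diff) >= abs(observed):
            extreme += 1

    print(f"Observed difference in means: {observed:.3f}")
    print(f"p-value: {extreme / trials:.3f}")  # large p => no evidence beyond placebo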

But that's not quite enough. This test would need to be done multiple times with different people, preferably by different scientists as well, to help negate any bias the experimenters may have.

To date, there have not been any such tests that prove the efficacy of homeopathy as a valid alternative treatment to modern medicine.

My personal view

I remain completely unconvinced. The first step, I feel, to proving homeopathy's efficacy as a viable medical treatment is to perform the kind of tests and studies I have detailed above. Every such study so far has shown it does no better than a placebo, yet homeopathic practitioners cry foul of these studies.

The problem with the homeopathic "culture" is that it also tends to make conspiratorial accusations. Claims against "Big Pharma" (whatever that is supposed to mean) try to denounce any studies that disprove homeopathy's effectiveness as attempts by the big pharmaceutical companies to protect their own profits. What doesn't seem to be factored in is that, if homeopathic remedies really did work, the pharmaceutical companies would be the first to leap on them as a new revenue producer.

Speaking of big money, there are homeopathic suppliers that charge outrageous sums of money for these alternative and more "effective" remedies. Homeopathy is a big business, making a lot of money out of people's ignorance.

There are also the scientific issues to deal with. Those dilutions I talked about would essentially wipe out all possibility of even one molecule of the active ingredient being present in a remedy. And once the dilution gets far enough, there will even be water molecules that have never come into contact with the original active ingredient. Apparently, these are supposed to inherit the "energy" or vibration of the ingredient from the water molecules carried over.

Homeopathic practitioners will also tell you that homeopathic remedies should be used in conjunction with conventional treatments. This serves very well to hide the lack of effect.

And lastly, the biggest reason I try to educate people on homeopathy is that, unlike other pseudo-sciences like astrology or the existence of psychics, we are dealing with people's health and well-being here! People are using homeopathic remedies expecting to be cured, and only end up delaying treatments that DO work, or dying, because there is an industry of quacks peddling water they claim cures you when there is NO evidence it does!

Wednesday, November 23, 2011

Reading the article critically (i.e. properly)

Internet news sources are all fighting for readership: the more readers you have, the more advertising revenue you can generate. And therefore, so many of them now resort to linkbaiting tactics and misleading representations of data in order to "scare" people into reading their supposedly factual news articles. The problem, also, is that so many people just cannot seem to read between the lines to see what a lot of these news reports are really saying.

Let's take an example. I have just read an article by Juniper Networks about the rise of mobile malware. The title is the usual link-baiting, scare-tactic fare: Mobile Malware Development Continues To Rise, Android Leads The Way. The article is here if you want to give it a quick once-over. I'm going to cover the title first. Take the word "malware" out of that title (Mobile Development Continues To Rise, Android Leads The Way). We can all probably surmise that's true as well. More and more development is happening in the mobile space every day. More and more companies and software developers are jumping on the mobile bandwagon. And, in fact, Android is leading the way there too: Android Market has more new apps being developed for it than any other mobile application store.

And this is my first point. If you say phenomenon X is on the increase, you first need to determine whether that's an anomaly relative to the trend in the area it operates in. I don't have figures for this (because, wouldn't you know it, that article didn't give any), but this discussion isn't about fact-checking the article; it's about the misleading way things are being reported.

For example, if all mobile application development is seeing an increase of 100%, and at the same time we are seeing malware development increasing by the same 100%, well, that's not pointing to an unexpected increase in malware. All mobile apps are increasing by that amount. It's just a part of the curve.

What a lot of articles also tend to do is throw "data" at the reader that makes things look worse than they really are, and the use of "% increase from last year" is another popular tactic. A lot of readers don't really seem to understand how that works. Using the linked Juniper article as an example, they make the claim:

A 472% increase in Android malware samples since July 2011.
Ok. Thanks. But what was the actual number in July 2011? If there were only 10 malware apps in July, that means now there are only 57.2. A tiny number compared to the vast number of non-malware applications. Why wouldn't they give you the real figures? I'm not privy to what they were thinking, but my first assumption is that the actual numbers are so low they wouldn't make compelling reading, so they use the scarier 472% instead.
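
For clarity, here is the arithmetic in a couple of lines (the baseline of 10 is my hypothetical from above, not Juniper's real figure):

    # A "472% increase" multiplies the baseline by 1 + 4.72 = 5.72,
    # so a tiny baseline stays tiny. These numbers are hypothetical.
    def after_increase(baseline: float, percent: float) -> float:
        """Value after a percentage increase."""
        return baseline * (1 + percent / 100)

    print(after_increase(10, 472))    # 57.2   -- still a tiny absolute number
    print(after_increase(1000, 472))  # 5720.0 -- same headline, very different scale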

And this practice continues throughout the article. All accurate data, just completely out of context and therefore ... useless!

But along with the use of out-of-context data, writers of these articles like to mix in other data which looks related but actually isn't. Here's an example from the piece:

October showed a 110% increase in malware sample collection over the previous month and a striking 171% increase from what had been collected up to July 2011.

So, the way they collect data about malware is by collecting the actual malware, kind of like anti-virus software on a computer does. What they don't mention in the article is that the increase in malware "collection" isn't only a factor of there actually being more malware; it's also a factor of improved methods of detecting and "collecting" it. Those are two very different trends which the article tries to bundle into one in order to help paint a scarier picture.

Readers should (must) read articles far more critically, as there are innumerable tactics writers will use to make you click that link, become outraged at the content, and share it. When you eventually break them down, these articles are usually full of half-truths and data taken out of context.

They find a way to lie by telling you the truth...

(And for the record, with regard to the example article itself, I'm not saying malware isn't an issue on mobile devices; I just don't think it's anywhere near as bad as Juniper paints it.)

Sunday, November 20, 2011

Thinking about your data model

Web applications are so much more than they used to be. With integrations into other web applications through exposed APIs, the shift to Single Sign-On mechanisms, and data sources that range from traditional database backends to NoSQL solutions such as Cassandra and even flat files, the amount of data an application needs to process and be aware of is pretty intense.

And yet most web applications treat every data source except the local database as a second-class citizen. Even though those alternate data sources are critical to the running of the application, it's only the database itself that is treated with abstraction within the application's model layer.

Model layer? Well, any web developer attempting to build a web application in this day and age without the structure of some form of MVC (Model-View-Controller) architecture behind it is asking for a difficult time ahead. MVC imparts a fixed structure to a project with a very sensible separation of concerns, making your web application a more maintainable and extensible product. If you still work in the days of single files with HTML, business logic and data access all scrunched together, then you are woefully behind current best practices.

Unfortunately, a lot of the power of the MVC design pattern is diluted by misuse. Hell, I have even caught myself doing it at times. The one aspect I am discussing here is the model (or data) layer, which exists for the sole purpose of being a central mechanism that lets you grab the data your application needs without having to worry about how that data is implemented, where it is stored, what the database architecture is, or even whether it's a database at all. And that last point is where things fall short.

A number of web apps I have seen (and BrandFu is not exempt from this, unfortunately) will use the model layer exclusively for the application's own database. Any other data source is accessed ad hoc, and in varying ways, throughout the application's controller and view layers, and occasionally within the model, but only to extend the ability to grab rows out of the database. The problem with this approach is that if you ever want to decouple from a specific data source, such as a web service, and switch from consuming that web service to storing and managing that data in your own database, it will be a nightmare.

I am not claiming to be blameless either; I do get caught out with this myself. Developing BrandFu, we found ourselves occasionally making calls to external web services from outside the model layer. And a few weeks ago, we had some interest from a company who would like to have the service installed as a separate instance on their own network, to be able to provide BrandFu services to their own clients but on their own managed servers.

Sounds great, but there's one problem. At SYNAQ we have an internally used "API" and Single Sign-On (SSO) system called SASY. The BrandFu application relies quite heavily on SASY as a data source, but unfortunately for us, the web service requests are scattered around the code in the controller layer. Not all of them, but a fair number.

The solution? Replicate the object model returned from these existing API calls as pseudo-database tables in our symfony schema.yml file. Essentially, map the data returned from these API calls as if they were tables in our local database. symfony can then auto-generate the model classes for these API calls, exactly as it would for the more traditional database model, except we can then go ahead and create methods within these model classes that, instead of querying our database, make the API request to SASY, hydrate the object and send that back.

The result is that any chunk of code that needs the data doesn't know where it came from. It doesn't care. As long as it gets what it wants and can continue processing, why should it? This also encourages much more re-use, reduces code complexity, and makes maintenance even easier.

The other benefit is that if we ever need to move away from an API-based data source for those "tables", their schema has already been defined, and adding the bit of extra code to make a database query instead of a REST request is a lot simpler. You could even support both an API data source and a local database and switch between the two via config, as the sketch below illustrates.
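
To make the idea concrete, here is a language-agnostic sketch in Python (rather than symfony/PHP); names like User, ApiUserSource and the config key are illustrative inventions of mine, not BrandFu's real code:

    from dataclasses import dataclass

    @dataclass
    class User:
        id: int
        name: str

    class ApiUserSource:
        """Hydrates User objects from a remote web service (e.g. an SSO API)."""
        def find(self, user_id: int) -> User:
            # Real code would make an HTTP request here and hydrate from JSON.
            return User(id=user_id, name="stubbed-from-api")

    class DbUserSource:
        """Hydrates User objects from the local database."""
        def find(self, user_id: int) -> User:
            # Real code would run a SQL query against the local schema here.
            return User(id=user_id, name="stubbed-from-db")

    def user_source(config: dict):
        """Pick the backend from config; calling code never knows the difference."""
        return ApiUserSource() if config["user_source"] == "api" else DbUserSource()

    source = user_source({"user_source": "api"})
    print(source.find(42))  # identical call whichever backend is configured

The calling code depends only on the find() method, so swapping the web service for a local table becomes a config change rather than a hunt through the controllers.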

In fact, that's exactly what I will be doing now. BrandFu is going to be transitioned to as clean a data model as possible over the next few weeks. This will make the application easier to maintain, easier to extend and easier to implement across a variety of systems and networks.

Monday, October 3, 2011

BrandFu: Distilling the lessons learnt

Over the last few months, yours truly has been a little busy. We recently built and launched a product in the US called BrandFu. I'll let you go look at the site for details on what it actually does.

Along the way, I have also been doing a bunch of reading on different aspects of our Internet economy. Jeff Jarvis' books were one source. What Would Google Do? is a fantastic look at the way the Internet (and not just Google) has changed the face of our economies, and at how to leverage the same techniques as the big players in that space to excel at what you want to accomplish in your business. I also recently bought another book of his as soon as I heard it was available on the Kindle: Public Parts. I haven't read it yet, only just started the intro, but again, it seems like a must-read for anyone involved in online economies.

Another addition to the reading list is The Lean Startup by IMVU co-founder Eric Ries. Again, I haven't finished it, but it's another good read that echoes a lot of the same thoughts as the other two books.

So with the references out of the way up front, I just wanted to highlight some of the big lessons learnt from our own experiment launching BrandFu into an unknown market, as well as what these books point out.

1. Be prepared to move out of your comfort zone

The first thing to be aware of is that the modern age requires people to multi-task. These days, if you want to be considered a valuable asset to your organisation, you need to be more than just an engineer. More than just a designer. You need to be able to dip your hand into marketing and customer service as much as you do actual coding.

Sounds like this has been said before, but it's surprising how many people, including myself, struggle to move out of that comfort zone. I wanted to develop a product. Dealing with customers was someone else's problem. What I didn't realise was that without customers (via marketing), we could get no feedback (via customer service), which meant we could not develop an application that best suited their needs. That customer interaction was vital to the engineering.

2. Release early. Even if you think you're not ready!

This is one of those scary aspects for developers. We want perfection. We don't want to push code out that might be buggy and feature-poor. But as I said above, you need customer input. You could spend 12 months in a silo developing what you believe is an awesome application, only to have customers come to you afterwards and tell you that it doesn't do what they need.

For BrandFu, we did an experiment first. We went from a very simple proof of concept, just to see what our stumbling blocks would be, to a rapid 3-month development cycle. And then we released the product to a South African customer base, leveraging SYNAQ's existing clientele. Sure, it was buggy and was missing features we thought would be awesome, but by releasing as early as possible to an admittedly smaller audience, we learnt a ton.

First of all, we learnt that the thing we thought would be the most popular feature, banner campaigns, was actually secondary to the signature management aspect of the product. If we had siloed ourselves, we would have made the banner and campaign management portion of the application absolutely kick ass, but people wanted to use the signature management stuff more. Releasing as early as we did pointed this out to us, and we were able to shift focus rapidly and early.

3. Stay Agile! Especially in the first few months

One of the key things we tried to do was remain as agile as possible. We have a ticket-tracking system, JIRA, which is an awesome piece of kit, but we found it slowed us down. While developing our Minimum Viable Product for BrandFu, we were still getting constant feedback from our South African user base. This meant that when new information came in, we had to analyse it, determine what we would do about it, and then implement. JIRA was slowing that cycle down.

We ended up with the team sitting in one room, around one table, with a big whiteboard. Ideas were hashed out immediately, details were scrawled onto the whiteboard and eventually erased when implemented. It meant our turnaround time was hours instead of weeks, and we could rapidly keep on top of the changes we needed to make for our launch.

Now that things have launched and we don't have a crazy deadline, we can switch back to the more staggered process of log ticket, allocate to sprint, execute and so on.

4. Don't get attached to your code

Your customers will give feedback (if you let them, of course, and why wouldn't you?), and they will tell you things you don't want to hear. That feature you thought would be awesome, and then people tell you they don't want it? Say goodbye to it. Never be afraid of throwing stuff away. The Lean Startup even has an example from IMVU, where the product had to be altered so dramatically after launch that it is now an almost entirely different product serving an entirely different need than the one it started with.

Things will change, things will be added, and things will need to get thrown away. It happens.

5. Anything else?

Sure! There were tons of things learnt, but a blog post is really not a good way to cover them all. The books I mentioned I would consider invaluable reading for anyone looking to develop a commercial application on the web. Hell, if you're looking to create ANY business in this era of Internet transactions and communications, you would find good use for the material in those books.