Wednesday, March 19, 2008

Just-in-Time vs. Just-in-Case BI Costs

Do you know how much power your BI really needs? More precisely, how much power it needs today at 9 AM, next weekend, and on the last day of the quarter or year? Have you bought the ultra-super-duper machine that handles even the highest usage spikes with ease? Or have you decided to sacrifice performance during those peak hours? Do you wait or waste?

The Gooddata approach to this dilemma can be described with two keywords: Stateless & Virtualized.

Stateless is about our architecture. Our product relies on six generic stateless services. Statelessness is important for scalability: we can dynamically add instances of any of the six generic services whenever we need to increase the throughput of our BI platform.

Virtualized is how these services are deployed. Virtualization allows us to flexibly add hardware nodes to our computing cloud. We have images of different virtual nodes on hand, so we can create a new node and dynamically add it to the cloud. The beauty is that all of this can happen in just a few minutes. And decommissioning such a node is even faster.
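To make the idea concrete, here is a minimal sketch of what this elastic model looks like; the class names, the six service names, and the scaling calls are purely illustrative, not our actual implementation:

    # Hypothetical sketch of elastic scaling of stateless services.
    # All names (VirtualNode, ServiceCluster, the service types) are illustrative.
    import itertools
    import random

    SERVICE_TYPES = ["connector", "transformer", "loader",
                     "query", "report", "export"]   # six generic services (made-up names)

    _node_ids = itertools.count(1)

    class VirtualNode:
        """A virtual machine created from a prebuilt image."""
        def __init__(self, image):
            self.id = next(_node_ids)
            self.image = image

    class ServiceCluster:
        """Pool of stateless service instances; any instance can serve any request."""
        def __init__(self):
            self.instances = {s: [] for s in SERVICE_TYPES}

        def scale_up(self, service, count=1):
            # Boot new nodes from the prepared image and register them (minutes, not weeks).
            for _ in range(count):
                self.instances[service].append(VirtualNode(image=f"{service}-image"))

        def scale_down(self, service, count=1):
            # Decommissioning is even faster: no state to drain, just drop the nodes.
            del self.instances[service][:count]

        def dispatch(self, service):
            # Stateless => any instance works; pick one at random, no sticky sessions.
            return random.choice(self.instances[service])

    cluster = ServiceCluster()
    cluster.scale_up("query", count=3)       # absorb a Monday 9 AM spike
    print(cluster.dispatch("query").id)
    cluster.scale_down("query", count=2)     # spike is over, release the nodes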

We (and you, as our customer) pay for CPU ticks and storage, so Stateless & Virtualized gives you unmatched cost efficiency. Gooddata offers you access to unlimited computing resources. You can get as much CPU, storage, and network bandwidth as you need, and you pay only for what you really consume.
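A back-of-the-envelope comparison shows why this matters; the hourly rate and the usage profile below are made-up numbers, picked only to illustrate the gap between sizing for the peak and paying for actual use:

    # Just-in-case: pay for peak capacity all month.
    # Just-in-time: pay only for the node hours actually used.
    HOURS_PER_MONTH = 30 * 24
    RATE_PER_NODE_HOUR = 0.40          # hypothetical price of one compute node hour

    peak_nodes = 20                    # capacity sized for the end-of-quarter spike
    just_in_case = peak_nodes * HOURS_PER_MONTH * RATE_PER_NODE_HOUR

    # Hypothetical usage: 2 nodes most of the month, 20 nodes for 3 spike days.
    spike_hours = 3 * 24
    baseline_hours = HOURS_PER_MONTH - spike_hours
    just_in_time = (2 * baseline_hours + peak_nodes * spike_hours) * RATE_PER_NODE_HOUR

    print(f"just-in-case: ${just_in_case:,.0f}")   # $5,760
    print(f"just-in-time: ${just_in_time:,.0f}")   # $1,094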

Pay for your BI project on a Just-in-Time, not a Just-in-Case, basis.

Friday, March 14, 2008

Gooddata Collaborative Analytics

The US dollar exchange rate has been going steadily down for more than 5 years. This works well for those of us who buy our gadgets in the US. We use the "Zasilkova Sluzba" service to deliver the stuff to our doorstep in the Czech Republic.

However, the declining dollar doesn't make us happy as entrepreneurs. It makes all the resources we buy in Europe more expensive. We sell our products for dollars and buy our resources mostly for Czech crowns, so we get less and less bang for one dollar. Roughly half the bang per dollar compared to 2002.

Fortunately, we are the "analytics guys," so we can predict the future and actively hedge ourselves against the declining US currency. Our latest research shows that the US dollar reaches zero sometime around 6/18/2012.
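For the curious, here is the kind of naive straight-line extrapolation that produces such a forecast; the two exchange-rate data points are rough, illustrative values, so the exact zero-crossing date will differ, but it lands in 2012 all the same:

    # Naive linear extrapolation of the USD/CZK rate (the "analysis" being mocked).
    from datetime import date, timedelta

    t0, rate0 = date(2002, 1, 1), 40.0    # ~40 CZK per USD (approximate)
    t1, rate1 = date(2008, 3, 14), 17.0   # ~17 CZK per USD (approximate)

    slope_per_day = (rate1 - rate0) / (t1 - t0).days   # CZK lost per day
    days_to_zero = -rate1 / slope_per_day               # keep falling at the same pace...
    print(t1 + timedelta(days=days_to_zero))            # ...and the dollar hits zero in 2012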



Knowing this, we can optimize our financial operations and get the most out of them. Tons of gadgets plus nice access to local resources.

DOES THIS RING A BELL? Have you seen such a "great" analysis before? The numbers are right; their interpretation is absurd. That is why we focus on collaborative analytics. Collaboration works great in cases like this. You can bet that I'm going to see a bull*it tag and some lampoon comments regarding my brainpower a few seconds after I publish such an analysis to the Gooddata platform. The product separates the wheat from the chaff using collaboration capabilities like tagging, rating, and commenting.

P.S.: Roman, I apologize for stealing your idea with the dollar analysis. I couldn't resist. In the end, this is also a bit about collaboration: you invented and designed it and I just implemented it. :)

Tuesday, March 11, 2008

World Wide Telescope

I have always wanted to explore the night sky. I'll never forget an August night when I watched falling stars with my father. We lay on a haystack and the millions of stars seemed so close. I was about seven years old.

Recently I decided to buy a telescope and started talking about it with my friends. One of them told me about the World Wide Telescope project from Microsoft. I was surprised that they hadn't started with a World Wide Microscope first, but anyway. You can check it out at http://www.worldwidetelescope.org/. The project was recently announced at TED and is promised for spring 2008. Isn't it spring already? ;)

Friday, March 7, 2008

Gooddata Academy Awards

Yesterday, Roman showed me his personal AMEX card management application that allows him to download his credit card transaction history. I started thinking about personal business intelligence at that very moment. One of our first Gooddata project templates should focus on loading these data and analyzing them upside down. A nice help for people who want a bit more than the few standard charts they get from AMEX. This one even beats the website log enrichment and analysis project idea that was my favorite until yesterday.
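Just to sketch the idea, a few lines like these would already answer questions the standard charts don't; the file name and CSV column names are made up, and the real AMEX export format will surely differ:

    # Tiny "personal BI" sketch: sum card spending per month and per merchant.
    import csv
    from collections import defaultdict

    by_month = defaultdict(float)
    by_merchant = defaultdict(float)

    with open("amex_transactions.csv", newline="") as f:
        for row in csv.DictReader(f):      # assumed columns: date, merchant, amount
            month = row["date"][:7]        # "2008-02-15" -> "2008-02"
            amount = float(row["amount"])
            by_month[month] += amount
            by_merchant[row["merchant"]] += amount

    print(sorted(by_month.items()))
    print(sorted(by_merchant.items(), key=lambda kv: -kv[1])[:10])   # top 10 merchants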

I bet that there are zillions of similar situations and formats. The question is: which one is the best theme for the soon-to-be-released Gooddata tutorial? Do you have a nice idea? Send it my way (zd at gooddata dot com). The Gooddata Academy will evaluate your nomination. A cute little iPod nano is waiting for the winner.

You should definitely participate! You might hate such contests. However, you certainly don't want me to give yet another iPod nano, pico, video, whatever to Roman for the AMEX idea. ;)

Wednesday, March 5, 2008

RUP vs Agile

Gooddata is growing. I see new faces in our office every Monday. The Gooddata "legacy" is often under fire. Our new colleagues come from many different environments and bring new ideas and perspectives. That's how we got to the RUP vs Agile discussion.

We practice agile development (Scrum) at Gooddata. We have just proudly started our fifth sprint. Our mantra is design, build & re-factor: iterate quickly until it is done. And suddenly there is a new Yam (yet-another-Martin in our office) who talks about the good old Rational Unified Process. Analysis, design, implementation, deployment. Requirements, use cases, class and sequence diagrams, logical and physical models, etc.

So where is the truth? Where does Agile meet RUP? How do we marry the concepts of these two worlds? I believe that RUP tells us WHAT we need to deliver (use cases, diagrams, models, schemas, code, and deployments) and Agile shows us HOW to deliver all of the above. The essential pieces that we can't live without go into the first iteration of the RUP delivery cycle; all the rest follows in subsequent iterations. If an iteration crashes because of new findings, we simply restart it. If we need to dump everything we have done in a previous iteration, we dump it. If we keep the iterations very short, we are not going to regret such losses. And we need to realize that we do not have to go all the way from use case to deployment in every iteration. The first iteration can end up with a half-baked command-line script that fakes some functionality rather than a fully fledged component with all the bells and whistles. We can get to deployment, a REST API, or whatever else it needs later.

I believe that we can have the best of both worlds. Let's blend the proven methodology, useful deliverables, and tools from RUP with the agility, flexibility, and efficiency that come from Agile. Quickly define the Gooddata way and start using it, knowing that we are going to re-factor it a million times in the Gooddata future. :)