Google’s upcoming IPO

Here is just a random prediction I wanted to put out about Google’s upcoming IPO. I’m probably wrong, but I wanted to put this in ink before the IPO happens…

First of all, I really admire the guys at Google for doing their IPO differently. There is absolutely no reason to let the bankers and underwriters walk away with more money from the IPO than the company itself. When I worked at Netscape, I watched Netscape collect $24 per share for IPO stock which the business guys then sold on the market for $70+ per share that very same day. That means that for every $24 that went to Netscape, $46 or more went to the bankers, who never built any product or any value. What a shame for both Netscape the company and the investors in Netscape. Congrats to Google for having the power, foresight, and courage to do it differently.

But I’m a little worried for our friends at Google too. I’m not planning to purchase any of their stock myself (much too risky for me!), but I did check out their IPO site while you could still get a bidder ID (you no longer can). What I saw was pretty daunting. The site was like most prospectuses – it outlined all the hazards, risks, and things which could go wrong. But on top of that, you had to click “accept” through about four pages of long-winded legalese and terms. As an individual investor, I got scared halfway through the process and abandoned my plan to get a bidder ID altogether.

Now, consider how traditional IPOs go. If you want to invest, you call your broker and say “I want to invest in Google”. He says okay, takes your request, and does everything else for you. It’s so easy. If you want to “invest on a whim”, he’s completely happy to take your money and help you do just that. All the legalese was signed and taken care of when you opened your brokerage account years ago, so there is no daunting process.

But Google really puts that process in your face, and I think it will scare investors away. I’m not an expert in the field, so I don’t know how many people will be scared off. Will institutional investors be scared? Or just private investors? I don’t know. But I do understand supply and demand, and the process simply can’t increase demand for their stock. Keep in mind, SEC regulations make it illegal to “pump up” your stock, so disclosing all the bad stuff is the right thing to do. Arguably, your broker makes it too easy to gloss over the warnings in a more traditional IPO.

On top of all the process just to get in, the IPO is also closed to non-US persons. This further decreases demand for the stock. I don’t know by how much, but it can’t be good.

Finally, there are all the recent publicity problems for Google – they forgot to register a bunch of shares they had issued, they accidentally spoke to the press during their quiet period, and they think the stock is worth $100+ per share at the opening, which seems pretty high to a lot of people. Wow – that’s a lot of bad stuff!

So, my prediction: I don’t think the stock will maintain a $100 price within one week of the IPO. The real question is “how low will the sellers go?” I do not think there will be much demand to buy at that price, but sellers may be unwilling to sell immediately for less. So I’m expecting lower-than-expected trading volumes and a gradual decline in the price, settling around $60 per share within three months.

Well, I hope everything goes really well for them, but I won’t feel bad for them in any case. The fact is they built a great product, and they will have success. At some level, what’s the difference between a $10, $20, or $30 billion IPO? The investors will whine about the difference, but to the employees who built Google, it’s a great success no matter what, and they should be proud.

Keep in mind that I have no idea what I’m talking about, and I am basically making all this up.

Can small businesses afford the .NET size?

This is a follow-up to my entry on May 06 about Managed code and C#. This may sound like I’ve bought into the Microsoft story, but it’s really based on my experience as an independent software developer. Decide for yourself though…

The question is – as a small business, can you afford to build your applications on .NET if some of your customers may not be able to install it? Will the 23MB download of .NET be so big that it limits your distribution and prevents your product from being a success?

The answer is pretty complicated. Is your target user a home user or a corporate user? Do you expect the IT department to install the product, or will the user install it directly? You should think through these questions before you decide what to do. Unfortunately, as with many technologies, using .NET is almost an all-or-nothing choice.

As for me, I’m a wholehearted believer in C#/.NET at this point, and I think most companies should elect to use .NET, despite the download. Here is why.

First of all, .NET ubiquity is growing. Microsoft claims that there are already over 80 million copies of the .NET framework installed. From the Lookout stats, it’s hard to tell what percentage of users already had .NET installed, but I think it’s about 35%. How many users didn’t install Lookout because of .NET is almost impossible to calculate. But I do know that by bundling the .NET installation into your install (which Lookout optionally does for users that don’t already have it), a lot of users are able to install easily. These users are probably broadband users, however.

The good news is that the .NET framework is being bundled with many new Windows installations today. The availability of .NET is only going to increase.

Here are some reasons you should use .NET.

1. Most developers agree they are more productive in .NET.
Developing in C# last year was eye-opening. The fact that two guys could build something as complicated as Lookout in such a short period of time is just amazing to me. We’re not geniuses, and we’re certainly not rocket scientists, but we were able to do it. A lot of that is thanks to .NET. There is no way we could have built an equivalent set of features in C++ in a similar amount of time.

I do think that Java offers many of the same benefits as C# from a pure development perspective. But the Java runtime is even less widely distributed than the .NET framework. So, if you are looking to build in a managed framework where you won’t have to bundle and distribute the 23MB framework, C# is a better bet. For server-side applications, you probably don’t care about the distribution of the framework.

2. .NET is more reliable.
In the case of Lookout, we were building an application that had to live inside of Outlook. Outlook is known to be one of the more treacherous programming environments out there. MAPI in particular (ask your developer friends) is a bit obtuse and easy to screw up. Managed code, however, runs within a protected boundary. Because it executes entirely under the .NET runtime, the native->managed wrappers put a big blanket around the .NET code. If your .NET code crashes or goes awry, it’s very easy to catch that crash so that it doesn’t percolate into the Outlook application itself. It’s difficult to accidentally corrupt the main application’s memory space. Lookout has received praise for its reliability (although it has its share of bugs too), and a big part of this, I believe, is the fact that as managed code, it can’t screw up its host application. Consider if it were C++, however: one bad pointer bug, and you take down all of Outlook! That’s a huge liability, a huge responsibility, and just downright scary.
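
Here is a minimal sketch of that “blanket”. The names (OnNewMail, UpdateIndex) are made up for illustration, not Lookout’s actual code; the point is just that every entry point the host can call into catches everything, so a failure in the add-in never propagates into Outlook.

```csharp
// Sketch: every host-facing entry point wraps its work in a catch-all
// handler so add-in failures never reach the host process (Outlook).
// OnNewMail and UpdateIndex are hypothetical names, not Lookout's API.
using System;

public class AddinEntryPoints
{
    // Hypothetical callback invoked by the host when new mail arrives.
    public void OnNewMail(string messageId)
    {
        try
        {
            UpdateIndex(messageId);
        }
        catch (Exception e)
        {
            // Log and swallow: a bug in the add-in must not take down Outlook.
            Console.Error.WriteLine("Add-in error (ignored): " + e.Message);
        }
    }

    private void UpdateIndex(string messageId)
    {
        // ... real indexing work would go here ...
    }
}
```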

3. .NET is more performant than C++ code.
This one may sound controversial to many people. However, I believe it to be true. Since C# is a managed language, you may think, “how can it possibly be faster?” Well, you are right at one level. If your application is just a number-crunching app that wants to drive the CPU as fast as it can, you can probably write a faster implementation in C++. But how many apps have that property? I’d argue almost none, except for pure research or scientific applications.

The performance of most real-world applications these days hinges on a combination of disk IO patterns, network IO patterns, and CPU patterns. This is a complex formula, and it is generally difficult to optimize. Talk to any performance expert out there, and they’ll tell you that the way to optimize is to design your app, build it, and then profile, profile, and profile again. Use profiling to figure out where your hot spots are, and then redesign those portions. This is where C# and .NET crush C++. The fact is that C++ is so complicated to maneuver in that refactoring based on profiling is a very difficult and time-consuming process. Most companies simply cannot afford to spend much time here. Developers can discover the major bottlenecks, but except in extreme cases, they do not have the time or resources to redesign them. Instead, they employ a set of “quick hacks” to work around the bottlenecks. These quick hacks become legacy problems for the codebase and don’t fix the underlying problem. Over the course of a year, after a few patch releases, the C++ code remains largely stagnant due to cost considerations.
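
To make the “profile, then redesign” loop concrete, here’s a minimal sketch (using today’s .NET APIs, with a made-up hot spot): time two candidate implementations of the same step, then keep the faster one.

```csharp
// Sketch of measure -> compare -> redesign. The "hot spot" here (building
// a large string) is a made-up stand-in for whatever profiling uncovers.
using System;
using System.Diagnostics;
using System.Text;

class ProfileSketch
{
    static void TimeIt(string label, Action work)
    {
        Stopwatch sw = Stopwatch.StartNew();
        work();
        sw.Stop();
        Console.WriteLine("{0,-14} {1,8:F1} ms", label, sw.Elapsed.TotalMilliseconds);
    }

    static void Main()
    {
        // Candidate 1: naive string concatenation (quadratic copying).
        TimeIt("naive concat", delegate
        {
            string s = "";
            for (int i = 0; i < 20000; i++) s += "x";
        });
        // Candidate 2: the redesigned version.
        TimeIt("StringBuilder", delegate
        {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 20000; i++) sb.Append('x');
            string s = sb.ToString();
        });
    }
}
```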

C#, however, can be refactored with much more ease. As problems arise, developers can much more easily rearchitect around performance bottlenecks. That profiling data does not go to waste – there is actually time to redesign large portions of the application without destabilizing the whole thing. Because of this, the 2nd and 3rd generations of a C# project will significantly outperform their C++ counterparts, and be higher quality too.

Case in point (and I am certainly biased here) is the Lookout MAPI indexer. I have tried a lot of the competitors’ products, and I believe the Lookout MAPI indexer is 2-5 times faster than any of the competitors’ indexers. The competition is written in C++. How is this so? We redesigned the indexing algorithm about three times based on experience and profiling. The C++ guys can’t keep up.
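
As a sketch of why those repeated redesigns stay cheap in C# (with hypothetical types, not Lookout’s actual design): keep the indexing algorithm behind a small interface, and each new generation can replace the previous one without touching the rest of the application.

```csharp
// Sketch: two "generations" of an indexer behind one interface, so a
// redesign swaps the implementation without destabilizing the callers.
// IMessageIndexer, LinearIndexer, and InvertedIndexer are hypothetical.
using System.Collections.Generic;

public interface IMessageIndexer
{
    void Add(string messageId, string body);
    IEnumerable<string> Search(string term);
}

// Generation 1: simple linear scan over every message body.
public class LinearIndexer : IMessageIndexer
{
    private readonly Dictionary<string, string> _messages = new Dictionary<string, string>();

    public void Add(string messageId, string body)
    {
        _messages[messageId] = body;
    }

    public IEnumerable<string> Search(string term)
    {
        foreach (KeyValuePair<string, string> kv in _messages)
            if (kv.Value.Contains(term))
                yield return kv.Key;
    }
}

// Generation 2: inverted index, built after profiling showed Search was hot.
public class InvertedIndexer : IMessageIndexer
{
    private readonly Dictionary<string, List<string>> _postings =
        new Dictionary<string, List<string>>();

    public void Add(string messageId, string body)
    {
        foreach (string word in body.Split(' '))
        {
            List<string> ids;
            if (!_postings.TryGetValue(word, out ids))
                _postings[word] = ids = new List<string>();
            if (!ids.Contains(messageId))
                ids.Add(messageId);
        }
    }

    public IEnumerable<string> Search(string term)
    {
        List<string> ids;
        return _postings.TryGetValue(term, out ids) ? (IEnumerable<string>)ids : new string[0];
    }
}
```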

Conclusion:
Well, if it’s really faster, has fewer bugs, and takes fewer resources to build, you know my conclusion. Some folks may still want their applications to target some of the old legacy machines out there (Windows 98, etc.), and if you really need that, C++ may be for you (although .NET does allegedly run on Win98 too). And you can’t ignore that .NET does require more RAM, so it may not run as well on the older machines. Anyway, I just hope that Microsoft bundles .NET into a service pack sometime soon so that this whole distribution question can start to go away.

Your Anti Virus Program is a Virus

I had a couple of reports over the last few days that the Lookout install was infected with some sort of trojan or virus. This is very alarming, of course! So we looked into it seriously.

What we found is a bug in Symantec’s product. On Aug 9th, the corporate edition of their anti-virus software published a new virus definition file which incorrectly flagged the Lookout installer as containing a virus. This has apparently been fixed in their Aug 10, rev 23 update of that file.

The particular file that was flagged as a virus was “nsisdl.dll”. It’s part of the NSIS installer, which is used by Lookout but was written by the WinAmp team. From reading around the net, you can see that their product (as well as every other product that uses NSIS) was suddenly hit by the antivirus product.

What the antivirus product does is delete the files which contain “bad stuff” – and it does so automatically. And the definition of “bad stuff” is auto-updated behind your back. I sure hope they don’t make mistakes like this very often. What would happen if your trusted anti-virus folks made a more serious blunder? What would happen if some hacker figured out how to edit that file (it’s probably signed to prevent tampering)? Shoot – with this powerful antivirus software running on your system, who needs a virus? If I were a hacker, I’d spend all my time dissecting the virus definition file from Symantec and trying to change it on their site. It would be hard work, but if you were successful, it would be the worst nightmare ever. Symantec has taken care of the distribution problem for you – just flip a couple of bits and that “anti” virus becomes the virus itself.
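
Since I’m only guessing that the definition file is signed, here’s a generic sketch of what that protection could look like (not Symantec’s actual format or process, just the general idea): the product ships with the vendor’s public key and refuses to apply any definition update whose signature doesn’t verify.

```csharp
// Generic sketch of signed-update verification (not Symantec's actual
// scheme): check an RSA signature over the definition file against a
// public key baked into the product before applying the update.
using System.IO;
using System.Security.Cryptography;

public static class DefinitionVerifier
{
    public static bool IsAuthentic(string definitionFile, string signatureFile, string publicKeyXml)
    {
        byte[] data = File.ReadAllBytes(definitionFile);
        byte[] signature = File.ReadAllBytes(signatureFile);

        using (RSACryptoServiceProvider rsa = new RSACryptoServiceProvider())
        {
            rsa.FromXmlString(publicKeyXml); // vendor's public key, shipped with the product
            return rsa.VerifyData(data, SHA256.Create(), signature);
        }
    }
}
```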

But you know, I’m paranoid. I guess false positives are part of the world we live in. Sucks.

Blog Spam

It’s a shame, but everyone seems to be doing their best to spam lately. I got two comments today (from the same person, pretending to be two people) with a link to their own advertisement for junk. I guess they are under the mistaken belief that Google will give them better placement if they use my site as a link to their spam? I guess they think Google can’t figure that out? Hmm. Spammers.

Anyway, if you think you’ll get away with spamming here, think again. I will crush you.