Sunday, August 28, 2011

Linux for the desktop – it's here now


People have always been interested in alternatives to the usual Microsoft operating systems. Apple has filled that niche very well with OS X, as the surge in the Mac product line over the last few years shows.

There is one operating system, however, that seems to have been reserved for geeks over the years, while always hinting that it might have promise for end users. The concept of "Linux for the desktop" has come up many times, but every time I tried to find a way to use Linux that really applied to "everyone", it came up short.

Small businesses with people who do not have a background in IT, and who need to control the costs of their computer systems, have consistently gravitated toward Microsoft. Even though Linux is "free to use", the IT resources it takes to deliver and support it have not been widely available to the SMB sector, except for a few very fortunate companies.

So that has left a vacuum in the Linux-for-the-desktop space. I have tried various Linux desktop distributions over the last decade, and every time I loaded one up, I never felt it was simple enough to hand to an end user while also offering enough functionality and stability, and being supportable at a low enough cost, to justify giving it to anyone but a computer technician and a few very specific niche users. The exception has been thin-client terminals running stripped-down, specialized versions of Linux that connect directly to a Terminal Server to run Microsoft applications.

A company called Canonical released a Linux distribution some years ago called "Ubuntu", which I have been following ever since. I have tried Ubuntu release after release, and each time I still felt like "this is not quite there yet." Recently, however, I had a rather different experience...

Some time ago, Ubuntu released a version of their Linux operating system called "10.04 Lucid Lynx LTS" - the LTS stands for "Long Term Support", and that is something that stuck out to me.
So I loaded up Ubuntu Desktop 10.04 to try it out, expecting to be in for a long night of fixing, hacking, and generally beating my head against the wall trying to get everything to work right.

I was shocked to find this version was nothing like what I expected. Not only did it load up easily without any issues, but I found myself able to do almost every task I needed my Windows PC for. What amazed me even more was how quickly I was able to get everything up and running, with a great deal of stability. After that, I discovered how Ubuntu Desktop could make me even more productive than I was on Windows for some tasks. This opened up a whole new world.

Probably the biggest reason this version of Linux works so well for regular end users is the "Software Center". It is available from the "start menu", and with one click you can search for, find, and automatically install hundreds if not thousands of applications that work perfectly with Ubuntu. These "apps" are typically free, too.
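For readers who prefer the command line, the Software Center is a front end to the same APT package system, so installs can even be scripted. Here is a minimal sketch in Python, assuming it runs with root privileges on a stock Ubuntu 10.04 install and that the package names shown exist in the enabled repositories (some applications I mention below, like Skype or Dropbox, come from extra repositories or vendor downloads instead):

    #!/usr/bin/env python
    # Minimal sketch: install a few packages through APT, the same system
    # that backs the Ubuntu Software Center. Run with root privileges.
    import subprocess

    packages = ["openoffice.org", "filezilla", "pitivi"]  # illustrative package names

    subprocess.check_call(["apt-get", "update"])                    # refresh the package index
    subprocess.check_call(["apt-get", "install", "-y"] + packages)  # install non-interactively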

Here is a list of applications I use on Ubuntu Desktop frequently:
  • OpenOffice.org – can open DOCX, XLSX, and PPTX files, print to PDF, and I wrote this blog post using it.
  • Skype – I even tested it successfully with a variety of webcams and microphones I had. Works great.
  • Empathy Chat Client – this program lets me mash up multiple chat networks, such as MSN, Yahoo, Facebook, Google Talk, AIM, and many more, into one unified chatting experience.
  • Gwibber – Allows me to mash up “broadcast networks” such as Facebook and Twitter into a unified feed, and also lets me broadcast to multiple social networks simultaneously.
  • Firefox – web browser. Most other freely available browsers are available too, and browsing the web from Ubuntu is a real pleasure.
  • Evolution E-mail - I tested it with my Exchange 2003 server and it synced up everything including my calendar and contacts.
  • Filezilla – FTP client
  • Remote Desktop – connections to Microsoft Terminal Servers and RDP sessions
  • LogMeIn – I am able to remotely control PCs and Macs using LogMeIn
  • Printing – Connected with no problems to my wireless HP Color Laserjet over a network connection, and printed to it.
  • Network Files – I had no problem connecting to, mapping, authenticating with, browsing, and updating files over SMB network shares running on Windows Server 2003 or 2008 (a scripted mount example follows this list)
  • CD Burning – I was able to easily burn CDs and DVDs
  • Dropbox – Ubuntu runs it great
  • Pandora – Firefox plays it fine, but I have Pandora One, the paid version. I was able to easily install Adobe Air by visiting Adobe's website, and then I was able to easily install Pandora One from Pandora's website. It runs great and I use it all the time.
  • Google Earth – works great
  • Pitivi – I had to read the user's manual to figure it out, but since it came with Ubuntu I figured I would try it. I was able to cut up video files and edit them, which is really nice for completely free software.
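
Since I mentioned SMB network shares in the list above, here is a minimal sketch of how mounting a Windows share could be scripted on Ubuntu. The server, share, user, and mount point names are hypothetical placeholders, and it assumes the CIFS mount utilities are installed and the script runs as root:

    #!/usr/bin/env python
    # Minimal sketch: mount a Windows (SMB/CIFS) share with mount.cifs.
    # All names below are hypothetical examples; mount.cifs will prompt
    # for the password since none is supplied on the command line.
    import os
    import subprocess

    server = "fileserver01"      # hypothetical Windows Server 2003/2008 machine
    share = "shared"             # hypothetical share name
    mount_point = "/mnt/shared"
    username = "brandon"         # hypothetical user account

    if not os.path.isdir(mount_point):
        os.makedirs(mount_point)

    subprocess.check_call([
        "mount", "-t", "cifs",
        "//%s/%s" % (server, share), mount_point,
        "-o", "username=%s,uid=1000" % username,
    ])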

There is a wide range of other software and tools I have installed, but these represent some of the things I always want on every install.

So, what kind of platform did I originally install it on? For my first install, I went with a fairly sophisticated gaming rig, just to try to throw it off: a Dell XPS tower built about four years ago, with a dual-core Intel CPU, 4 GB of RAM, dual Nvidia GT 8900s in SLI, and an onboard SATA RAID controller.

The install went great. Everything worked right out of the box, except that I had to activate my Nvidia drivers, which then let me get dual screens working quickly.
I pushed this hardware platform as hard as I could, loading as many programs as possible at the same time, running dozens of open browser connections, large Word and Excel files, playing videos, and listening to music, all at once, and I could not get this machine to stutter for even a second.
So I decided I wanted to see just how far I could push Ubuntu.

I have a variety of other hardware laying around, so I put together a couple test chassis. One had 512 MB of RAM and a socket 478 Celeron chip (older kind) while the other had 1 GB of RAM and a socket 478 P-4. Both booted off USB and both used AGP GeForce MX 440 cards.

While I found the Celeron was underpowered and always had the CPU pegged, the P-4 machine was quite functional. In fact, I set up an entire auxiliary workstation for myself on a mere P-4 with 1 GB of RAM, and I can use it for most of the tasks I need to do when I am not in front of my larger rig.

So now I am able to run all the applications listed above, and many more, on a dual-screen system with only a single-core CPU and 1 GB of RAM, all running on an older platform. I am able to achieve a high level of functionality, and I'm not really limited by much of anything.

Of course different people use computers in different ways, but today, the trend is moving towards more web-enabled applications that can run across many platforms. This means that more and more people can tap into Ubuntu Linux to extend, increase and enhance their computer use.

Right now, businesses and individuals can tap into this same power too, quite easily and cost effectively.

Picture this: for around $100, it is pretty easy to build a single-core P4 with 1 GB of RAM, no hard disk, and a decent but older graphics card that supports dual screens. In my case, I'm using recycled or decommissioned CRTs and small older LCDs that can be had really cheap, but snap any two monitors onto this system and you've got yourself an entire Linux system you can work from. Craigslist, pawn shops, flea markets, and local eBay listings are all great places to find this kind of hardware, including the monitors, which can be picked up locally for $20-$30 each for small CRTs.

In my case, I'm booting an old Dell OptiPlex GX260 from an 8 GB USB stick and running it all without any speed issues. Since this computer has no moving parts except the power supply, and is only a single core, I'm able to run it in very harsh environments as well.
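
For anyone curious how that USB stick gets made in the first place: Ubuntu ships a graphical Startup Disk Creator, but writing the image can also be scripted. Below is a rough sketch; the ISO filename and the USB device node are placeholders you would verify on your own machine, and it assumes the image is one that can be written straight to a stick (the Startup Disk Creator or UNetbootin handles the ones that can't):

    #!/usr/bin/env python
    # Rough sketch: write an Ubuntu installer image to a USB stick with dd.
    # WARNING: the device node below is a hypothetical example; writing to
    # the wrong device destroys its data. Check first with `sudo fdisk -l`.
    import subprocess

    iso_path = "ubuntu-10.04-desktop-i386.iso"  # assumed local filename
    usb_device = "/dev/sdb"                     # hypothetical USB device node

    subprocess.check_call(["dd", "if=" + iso_path, "of=" + usb_device, "bs=4M"])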

So, if you need to extend your computer system, or add an extra computer for yourself, consider looking at Ubuntu as an option. Right now, there are hundreds of PCs being recycled that can easily be brought up to these specifications for a very small amount of money. These computers would otherwise end up in the trash (hopefully at a computer recycler), and they are typically donated because they are "useless". By re-deploying this "useless" hardware as useful, low-cost, high-productivity workstations, individuals and businesses can do more for less. Right now, that's a very good thing, and worth looking into.

Tuesday, June 14, 2011

Is Facebook really losing users? Judge for yourself:

I recently spotted a headline claiming that Facebook was losing users. A report was published that indicated they lost 6 million US subscribers in 1 month. There's a little chatter on the blogs about "the decline of Facebook" and that sort of thing, but Facebook has also come out strongly saying that the report is flawed.

This piqued my interest, so I did a little research and found the following, by generating some Internet usage and traffic reports from public, third-party data at Alexa.com.

Here's what I see:

Facebook has been trending steadily upward when you look at its total "reach" across the whole of global Internet use. This graph shows that more people worldwide are being reached by Facebook, on a very nice upward trend.


This next graph shows a downward trend for people who access Facebook through a search engine. What I see in this graph is that more and more people are going to Facebook directly, not using Google or Bing to find it, because they already know where it is or have a shortcut to it.



This graph shows usage of Facebook's site is steady, and people are staying on it for more or less the same amount of time as always, maybe trending up a little right now.


This graph shows traffic to Facebook is spiking, as more and more people are each using Facebook for the same amount of time.


And finally, what I see here is, as the popularity of Facebook increases, more and more people are going to be first time visitors and bounce off the home page a few times before they might actually sign up.

I looked over the data at Alexa and I didn't really see anything that makes me think Facebook is in some kind of huge decline anywhere. But look over the data yourself; maybe there are other ways to interpret it...

Tuesday, April 19, 2011

IT efficiency's dark side

"We have a system whereby corporations have been able to achieve a level of productivity they had before the recession even with 8.8 percent unemployment. They managed to achieve prosperity without anyone else having any prosperity."
- Lawrence Mishel, the president of the Economic Policy Institute.

Since computers have been around in business, what has their ultimate purpose always been? I've said before that it's to give companies a competitive advantage, but that advantage comes through increased efficiency, and efficiency has consequences. When a robot does your job, who needs you?

So business IT systems have continued to automate more and more jobs and simplify more and more tasks, and the effect has crept into the market like a building pressure. Businesses found little need to refill the positions they eliminated or reverse the reductions they made, because IT systems once again played a role in process automation, to the point where the vanishing jobs let companies start realizing the untapped benefits and potential of their current IT systems.
With their jobs at risk, more and more IT managers began searching for ways to increase efficiency and justify themselves, which naturally led to even more rapid expansion of IT projects and demonstrations of ways to use current software investments even more efficiently.

The result? You see continued investment in the IT sector fueled by large corporate profits but you don't see these same companies adding very many jobs here.

So I guess there's a dark side to the technology systems we are dealing with. Technology is a contributor, but only insofar as it channels the human innovation coming from the most motivated people, and isn't that how history tends to work anyway?

Thursday, April 14, 2011

Google street view car spotting

Here are a couple of pics I took during a recent encounter with the Google Street View car.




Seeing the forest through the trees: The Cloud through The Storm

Every time I turn around now, I see so much news and info about "The Cloud" that I'm being asked more and more frequently, "What is all this Cloud stuff about, Brandon?" So I want to break down some important elements of "The Cloud" and give everyone a little less nebulous definition of what it is.

You've got to start with a little understanding of why IT people even invented "The Cloud" before you can really understand its different major parts, and how they all interact to make this virtual storm that's flooding our living rooms with ads that make no sense to most people and still baffle many IT people today.

Back in the 70's, and in a big way in the 80's, mainstream businesses started doing massive deployments of computers. They had to remain competitive, and these systems promised to increase productivity, lower costs, and pay for themselves. Back then, the PC as we know it was mostly a stand-alone computer that people rarely networked. The big players were Unix and Novell. These systems gave people the ability to work from "dumb terminals" that stored no information; they merely received letters and numbers from the server and displayed them on the screen for the user.
These kinds of "networked" systems operated with all the information stored on a set of central servers, then pushed over serial cable (very slow) to a user's terminal.

Now, back then, off-the-shelf software applications were virtually non-existent. You couldn't easily find a great "app" for what you were looking for. The solution was to hire computer programmers who could customize software to meet the needs of your specific business.
Inside most businesses, there weren't really any rules for how to standardize this practice or that. Most businesses of the day ran on processes either ingrained from decades past or designed around methods that had never involved computers at all.
This meant there was a giant market opportunity for programmers to capitalize on at nearly every business, because computers could significantly improve business process methodologies.

Over the next decade, more and more software was developed, and computer system architecture grew at lightning speed. By the 90's, major software vendors had emerged, and by cobbling together the best-of-breed business processes from their thousands of clients over the previous decade, these software giants started selling fully integrated suites of business applications, which emerged on PCs first.

The PC took off, and so did the world of computer networking. The 90's are when we saw the fast development of PC-based networks and the Internet.
The PC had amazing processing power and ran a powerful operating system onto which you could easily install a wide range of software. Software developers had written thousands of applications over the previous couple of decades, and they adapted the best of breed to the PC. Suddenly, powerful software that end users could actually use came to low-cost, networked PCs. Another business and consumer IT spending spree began.

What made PCs even better was that they cost significantly less than their Unix-based counterparts.
Now even smaller businesses could use computers to compete with their larger, wealthier competitors. A large sector of the market started moving toward these PC-based systems. The PC offered greater ease of use than character-based Unix, great flexibility, the ability to install your own software from a wide range of exciting choices, and the list goes on and on. There were so many reasons to upgrade from the IT systems of the 70's and 80's that the vast majority of IT purchases during the 90's and beyond would be made on PCs.

But the PC wasn't the computing utopia some people hoped it would be. It brought all kinds of new issues. For one, data started being accumulated all over the place, software was only accessible on the computers you installed it on, and flexible remote access - the norm in the 80's - became much more difficult once you introduced the graphical interfaces we use today.

The distribution of computing resources and power reached a point where its inefficiencies had to be countered.

Enter one definition of "The Cloud": the idea of taking the software you run on your computer and getting it to run in a manner that makes it distributable, accessible, and secure, in a way that doesn't depend on your computer.
Actually, The Cloud is a model that looks remarkably similar to the IT infrastructure models used in the 80's. Big computers serve actual software applications to end users. Groups of users collaborate in networks that share certain data between these applications.

But there's so much more to it than that.

The cloud is an idea - a methodology of computing, and it's shaping up to be quite a storm.

The computing methods of the 90's and the last decade are due for a major upgrade. When was the last time you upgraded your computers at work? It was either recently, a couple of years ago, or you're really due. Computers are constantly going out of style. They have to get replaced all the time. The capital investments that companies make in their IT hardware alone account for billions of dollars spent every year, and the cycle never seems to stop.

Why do you need to upgrade your computers?
  1. My software runs slow
  2. I can't run the latest version of my software on this computer
  3. My hardware is really old and crashes a lot
Software is the #1 reason to use computers. The same software powers our productivity. So when our computers run slow, our productivity suffers.

Every year, software gets heavier and heavier, and computers get faster and faster. It's no wonder that running today's software on older computers slows them down - the newer, faster computers always run today's software great - but who can afford to upgrade all their computers every year? These days, the life of the average PC is getting longer and longer, so it's putting a big drag on user productivity.

Setting up new computers is extremely time-consuming. Granted, in large companies with heavy standardization there's a process in place to help reduce migration complexity, because it's a constant task; but in mainstream USA, the average PC migration can take 4-6 hours if performed by a good technician with a good plan. I've seen complicated end-user computers take double that many times. So with all these costs, what's the answer?

Confounding things more, when a user's desktop crashes it's common for some critical data to live only there - and in a business network environment it's really hard to keep physical copies and images of all your end-user desktops. We can do it, but storing all of that gets expensive, and the load it puts on the network alone makes it a double-edged sword on all but the most heavily invested gigabit or fiber networks.

Back in the 80's, the "servers" delivered the applications directly to the end users. In the 90's, when Microsoft came into the picture in a big way, that largely stopped. Suddenly the way to go was moving "files" from the "server" to the PC, where the end user's software would process the data.
Higher-end applications hosted databases that many users on a network could access at once, but the standard method these software developers used was to install a physical "app" on each computer on the network.

Now, with "The Cloud" the idea is to move that processing load off the end user's desktop, and put it back on the server.
There's two major approaches being worked on right now with this path:
  • Put the applications themselves in "The Cloud" - the user gets them from a web browser
  • Put the PC Operating System (like Windows) directly in the Cloud. The user accesses their environment remotely and runs their software
An example of running your software applications in the cloud is Salesforce.com - they have a highly integrated, widely deployed "Customer Relationship Management" application that runs entirely from your browser. The big deal is that it can be made to integrate with the software you're running in your business today - servers like Microsoft Exchange, accounting systems like QuickBooks, and so on. The high level of integration, coupled with it being a "web-based application", made it a huge success when it debuted.

In the case of accessing your desktop environment remotely, many people are familiar with that today through tools such as Microsoft RDP, also known as Terminal Services, or Citrix, which falls into a similar category. Other solutions like LogMeIn, GoToMyPC, and others have given many users untethered, unfettered access to their desktops when they're away.
Businesses have been deploying servers that let people gain remote access for years, so that's nothing new. What is new is that the Internet has become fast and stable enough to really optimize the delivery of virtual desktops to end users and let them do all their computing that way.
Then again, that's already been going on for years too. So what really makes this "a new way of doing things"?

The answer is a technology called virtualization, which has emerged as a major game-changing technology over the last ten years - really the last 6-8.
So you might have heard of companies like VMware that sell products which allow running "virtual" or "emulated" computing environments.
Back in about 2001, I had a copy of VMware installed on my XP desktop. I was able to "emulate" virtual hardware as if I had a whole separate physical PC sitting there. I created several of these "virtual hardware machines" and then installed a bunch of different operating systems on them. I would start up my XP machine, run VMware, and boot up a Windows 2000 box, Windows 98, NT 4.0, other copies of XP - I could even boot them all up at the same time and network them together, all from windows I could simply minimize on my XP desktop while continuing to use my computer normally.

This was useful for many purposes, mostly testing things. It really slowed down my computer, though, and it had some stability issues and other problems that made it less than useful for "mission critical server workloads" at the time.

Over the years that got fixed. Then computers got more powerful. And now, big huge datacenters with super crazy fast Internet connections can serve up virtual servers and even desktops, to end users.

So, in this usage scenario of "The Cloud", we are talking about running really big, powerful servers in a big stack, like a "computer cluster", and then chopping up their collective power into "virtual" chunks that become your servers. If a server "node" goes down, the cluster can dynamically reassemble itself with minimal or no interruption. Basically, it's self-healing. It's also incredibly redundant and very difficult to break.

But where is "The Cloud"?

It depends. Are you talking about a "Private Cloud?" or a "Public Cloud?"

Here's the deal, there are two major ways the IT community builds clouds:
  • You own it - example VMWARE, Zenith Infotech SmartStyle
  • You access it - example Microsoft Azure, Amazon Web Services, Rackspace
Situations like Salesforce.com or hosted web-based solutions aren't in the list above because I'm not talking about using solutions built IN The Cloud; I'm talking about ways to actually build the cloud itself.

Owning a Private Cloud
Not comfortable with your data living on a set of servers you have no access to, or only limited knowledge of the physical geographic location of? Worried about future news stories breaking that say "Major cloud breach! Data stolen!" and don't trust encryption? Maybe you have regulatory issues to deal with and can't find a public cloud provider who can meet your compliance requirements.

The private cloud is simply a set of servers you buy for your business and then install, either onsite or in a hosted "data center" where you, or at least someone you know, typically has direct physical access to your hardware. Companies like VMware and Zenith Infotech have pre-fabricated, boxed "private cloud" packages that basically consist of a manufactured cluster of servers running a suite of virtualization software.
This hardware is a fixed capital expense, typically with ongoing maintenance and hosting costs.

The Public Cloud
So in this case, you are actually leasing or accessing a set of computing resources that run on "virtual" infrastructure provided by a large hosting provider.
Some very large companies have been building some pretty substantial public datacenters recently. This is really interesting because it gives a broad range of computer users, and businesses, access to a highly scalable, incrementally priced computing resource.

One company that came to this market a couple of years ago was Amazon. They had huge datacenter facilities and the in-house expertise to deliver this new kind of computer access model. Suddenly companies could lease a server right out of the cloud and pay as little as pennies an hour for it, while it was running.
Not to be outdone, Rackspace joined The Cloud revolution, launching a whole suite of similar products on a platform very similar to Amazon's.
Amazon, Google, and Rackspace, among others, have built a whole framework of cloud-based products that enable businesses to tap into everything from highly accessible and secure storage to application load balancing and running applications directly from The Cloud.
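
To make the "pennies an hour" idea a little more concrete, here is a minimal sketch of what leasing a server through Amazon's EC2 API can look like from Python, using the boto library. The AMI ID and key pair name are hypothetical placeholders, and it assumes your AWS credentials are already configured for boto:

    #!/usr/bin/env python
    # Minimal sketch: launch a pay-by-the-hour server on Amazon EC2 with boto.
    # The AMI ID and key pair name below are hypothetical placeholders.
    import boto

    conn = boto.connect_ec2()  # reads AWS credentials from the environment/config

    reservation = conn.run_instances(
        "ami-12345678",            # hypothetical machine image (AMI) ID
        instance_type="t1.micro",  # one of the smallest, cheapest instance sizes
        key_name="my-keypair",     # hypothetical SSH key pair name
    )
    instance = reservation.instances[0]
    print "Launched instance %s" % instance.id

    # When finished, terminate it so the hourly billing stops:
    # conn.terminate_instances([instance.id])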

That's another interesting usage scenario - running your software directly from the cloud without really having the operating system layer in the mix... That is about as alien an idea as you can get for most IT people, or at least it was a couple of years ago...

Microsoft decided it wanted to get into this Cloud stuff a while back. They cooked up a product called "Azure", which was pretty mysterious when they first started talking about it. They are taking a bit of a gamble here too, because this is an absolutely massive shift in the sands of IT they're going for.

It's called "Platform as a Service" and it goes a little something like this:

"What if we could get the programmers to reprogram their software, just a little bit (assuming it was built to run on Windows) and get all the software to basically run from the web browser, and all the databases just get hosted up in our servers, then all the processing just gets delivered, but the user, or user's employer actually, never has to even worry about any "server" operating system or hardware at all - because it's all just taken care of by us, Microsoft, up here in The Cloud."

So this is a huge change - instead of hosting software on a server, you're hosting software literally on the fabric of the cloud, and the computing resources are dynamically allocated by that infrastructure. All the while, this whole "platform" is being delivered by a huge provider, like Microsoft.

Now, Azure is much more than just that. It can deploy server operating system instances now too. That's really helpful for businesses who want to start tapping into Microsoft's new Cloud platform but don't yet have this fabled "cloud software" that Microsoft is campaigning for its developers to write against its API - Application Programming Interface.

At this point, why should I even bring up APIs? It's a whole different conversation, except that in this context I think it's important to mention a little something about them...

Microsoft is pushing their API for Azure.
Guess who else is pushing a Cloud based API?

Well, just about all the major cloud players out there. Amazon has an API, Rackspace coined one called "OpenStack", and Google has an API for its App Engine service, which competes with Amazon's Beanstalk - and both Beanstalk and App Engine are poised to be competitors to Azure itself. So as you can see, the API world for these Cloud providers is heating up.
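
As a small taste of what programming against one of these cloud platforms looks like, here is roughly what a minimal Python application for Google App Engine looked like at the time. The provider supplies the webapp framework and runs the code on its own infrastructure, with no server operating system for the developer to manage; treat this as an illustrative sketch rather than a polished app:

    #!/usr/bin/env python
    # Illustrative sketch: a minimal Google App Engine (Python) application.
    # App Engine hosts and scales this handler on Google's infrastructure.
    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class MainPage(webapp.RequestHandler):
        def get(self):
            # Respond to HTTP GET requests at the site root.
            self.response.headers["Content-Type"] = "text/plain"
            self.response.out.write("Hello from The Cloud")

    application = webapp.WSGIApplication([("/", MainPage)], debug=True)

    def main():
        run_wsgi_app(application)

    if __name__ == "__main__":
        main()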

Microsoft knows this, and knows that a massive wave of Cloud programming is on the horizon. So naturally, it just makes sense that along the way, .NET developers, VB types, and all the MSDN guys will inevitably code against the Azure API, because everyone else is fixing to start doing massive amounts of programming against Cloud APIs for all these providers anyway.

Who are these providers exactly? I've been asked that very question recently, and that's actually what inspired this post - so thanks Frank.

Here's the list of providers who are really getting into the Public Cloud Market:
  • Amazon / Beanstalk, S3, EC2, etc
  • Rackspace / Openstack
  • Google / AppEngine
  • Microsoft / Azure
Some other providers doing cloud servers include companies like 1and1, GoDaddy, GoGrid, and many others. The reason these platforms are all called "cloud" servers is that they are servers with some kind of incremental cost, upgradable basically at the click of a button, because they run on virtual infrastructure built on a cluster of shared resources at the provider's datacenter.

The Cloud is really a storm of methodologies that are changing the way people access and use their software.

The Cloud is nothing new. It's been around for years. It has gotten more press lately, as companies like Cisco and Microsoft put ads on TV, and conversations about it happen in corporate boardrooms more and more every day.

Hopefully this article helps someone understand The Cloud a little better. It's got a wide range of elements that are still forming. Learn more about tapping into the Cloud from NODE. We can help take your business there and figure out how to navigate this huge change in IT.