Tuesday, June 29, 2010

How the internet works

When you understand all the nuances of technology, from how a computer encodes a single bit onto a wire for transmission all the way up to the protocols everyone knows about, like HTTP and HTML, it's amazing that we don't have more connection issues.

I was mulling this over a few days ago, and an issue I dealt with today reminded me of all those layers, and of how the simplest thing can bring the whole system down.

The issue I was trying to resolve for a customer was that a particular webpage refused to load... The cause, as I eventually determined, was that an administrator for that webpage had blocked access from the customer's address for whatever reason.

Going through the steps, though, I checked each layer in turn: that the connection itself was fine, that DNS resolution was working correctly, and that the IP settings were correct.

Think about what happens when we type google.com into a browser's address bar. The browser instructs the system to establish a connection; the system, knowing it has to resolve the name, examines its DNS settings. If the DNS server isn't on a directly connected route in the routing table, the request has to go out through the default gateway on the default route, which means the system first needs to ARP the default gateway for its MAC address.
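
As a rough illustration (assuming a Linux machine with the iproute2 tools and Python; the 8.8.8.8 address is just an example resolver, not part of the original story), you can watch both of those lookups from userspace:

    # Minimal sketch: which route/gateway would carry traffic to a DNS server,
    # and what's currently in the ARP/neighbour cache. Assumes Linux + iproute2.
    import subprocess

    dns_server = "8.8.8.8"  # example resolver address; substitute your own

    # 'ip route get' shows the chosen route (and default gateway, if any);
    # 'ip neigh show' lists the cached MAC addresses learned via ARP.
    print(subprocess.check_output(["ip", "route", "get", dns_server], text=True))
    print(subprocess.check_output(["ip", "neigh", "show"], text=True))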

After resolving the MAC address of the default gateway, the system assembles a UDP DNS lookup request: an Ethernet frame addressed to the gateway's MAC address, an IP packet addressed to the DNS server, and a DNS query asking about google.com.
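
To make that concrete, here's a minimal sketch (in Python, assuming a reachable recursive resolver; again, 8.8.8.8 is only an example) of building such a DNS query by hand and sending it over UDP to port 53:

    # Hand-built DNS query for google.com, sent over UDP to a resolver.
    import socket
    import struct

    def build_query(hostname):
        # Header: transaction ID, flags (recursion desired), 1 question,
        # 0 answer / authority / additional records.
        header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
        # Question: each label prefixed by its length, terminated by a zero byte.
        question = b"".join(
            bytes([len(label)]) + label.encode("ascii")
            for label in hostname.split(".")
        ) + b"\x00"
        question += struct.pack(">HH", 1, 1)  # QTYPE=1 (A record), QCLASS=1 (IN)
        return header + question

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2)
    sock.sendto(build_query("google.com"), ("8.8.8.8", 53))
    response, _ = sock.recvfrom(512)
    print(f"Got {len(response)} bytes back from the resolver")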

After it receives a response, it then needs to check the resulting IP address for Google against the local routing table, determine whether the SYN should go out on the same adapter to the same default gateway, and initiate the TCP session.

All this in approximately 1/10th of a second or less.
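
If you want a rough feel for that number, here's a small sketch (assuming Python on a machine with working DNS) that times the resolve-then-connect sequence described above:

    # Time DNS resolution plus the TCP handshake to google.com.
    import socket
    import time

    start = time.time()
    addr = socket.gethostbyname("google.com")        # DNS lookup (UDP)
    conn = socket.create_connection((addr, 80), 5)   # TCP handshake (SYN, SYN-ACK, ACK)
    elapsed = time.time() - start
    conn.close()

    print(f"Resolved to {addr} and connected in {elapsed * 1000:.1f} ms")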

That's a lot to do in so little time... but we do it so often, and so invisibly, that the significance of it all gets lost. The process, and any appreciation of it, is definitely lost on end users.

Oh well, I appreciate my DHCP and DNS servers, as well as my local default gateway for everything they do for me, each and every nanosecond.

Saturday, June 26, 2010

Radio

It's obvious that the radio we all knew from childhood, AM and FM, is on the way out. If you're looking around at all, you know that. Between satellite radio and internet radio, broadcast radio is being crushed out of the market.

There's a lot of market share at stake between internet and satellite. Many people are currently buying cars with built-in satellite radio systems, which can easily be ignored by anyone competent who would rather use some type of cell-phone-based internet radio. With some service providers offering unlimited data on certain cell phone plans, internet radio in the car is becoming more viable.

However, a lot of people have, or will get, a car with satellite radio integrated. Between that integration and the complexity of actually getting internet radio into a car, most people will probably just pay for the service and forget about it. Which brings me to my next point... satellite radio comes down to just one company: Sirius Satellite, or XM, which are now one and the same.

Sirius/XM offers not only satellite, which you can listen to on the receiver built into your car, but also internet radio you can listen to on any computer, as well as iPhone, BlackBerry, and Android applications. The iPhone/BB/Android markets are really the most significant place for any company to distribute a new application that would compete with Sirius/XM. Prime examples of this competition are services like Pandora Internet Radio and Grooveshark. I'm not sure what Pandora is peddling for their mobile application, since it's been a while since Pandora was available where I live, but with Grooveshark there's a subscription fee, for Grooveshark VIP status, that allows the use of the mobile app.

I dunno about you, but if I have to choose between listening to internet radio all the time (in the car, on the computer, on the go, etc.), and listening to satellite when I'm in the car (or on the go, depending on the receiver) plus internet radio everywhere else, for fairly comparable prices... I'm in favor of satellite.

I know that with my satellite receiver I can buy a home kit that will hook up to my stereo at home; it also has a battery, so I can plug in some headphones and listen on the bus, on a street corner, wherever. When I'm at home or away from my unit, I can always load up the Android app and listen wherever I happen to be (provided I get a 3G or wifi signal)... and if I happen to be somewhere strange, like a friend's house, and want to listen to some tunes on my Sirius/XM subscription, I can log in on the website and listen to all my favorite channels over the internet.

While some services, like Pandora, can practically generate your own radio stations based on what YOU like, many people don't care for that level of customization, or just don't have the time to tweak the settings until the stations are just so. And even then, they're still using an internet radio service over their 3G data plan... so there's that.

Overall, I believe there will always be a place for dedicated internet-only radio... I just don't think it's in the car. That's why it'll be difficult to get rid of broadcast radio, and even more difficult to overtake satellite.

Thursday, June 24, 2010

HTML and Web 2.fail

For the most part, the ideas behind Web 2.0 and other HTML- and webpage-driven concepts are excellent, but lost on the average user.

I remember back in the golden age of the .com bubble, when GeoCities was hot and everyone had to have a page of their own creation. The biggest problem was that 90% of users were running Windows XP, which had never been customized, and Internet Explorer 6. This was the root of all problems.

First, a user would create a webpage (going beyond their means, I might add), and they'd show it to a friend... To clarify: an unconfigured Windows XP install defaults to a screen resolution of 1024x768, which, at the time, was huge for most users (who were previously used to the Windows 9x default of 640x480... or something).
Add to that the fact that they were using one of the worst browsers in the history of the internet, and you have a recipe for failure.

The page did look fine to them, though, and to all their friends (also using 1024-wide display resolutions and IE6) it looked fine too. But then a Mac user would come along, or someone with a correctly configured display would take a look, and they would wonder what the heck was wrong with your site.

First of all, the sites were put together by users who didn't know how to correctly create the effects you see on almost every webpage (e.g. tables, graphics, etc.); nor did they understand when it was appropriate to use those technologies and when it wasn't. Even something as basic as an unordered list would be beyond the scope of a user's understanding.

Then add graphics, and all the various graphics editors, like Paint. Someone would create a background in Paint and upload the BMP file to GeoCities; it would look fine to them, while the rest of us wouldn't see it at all (because it was still loading)... Even when it did load, anyone viewing it on a non-standard setup would probably see large white bars surrounding the image that the person who created it had no idea were there.

Nowadays this doesn't generally cause problems. The reason I'm revisiting all this fail that so many took part in is that, recently, someone I follow regularly (out of blind interest in what they're doing) set up a Twitter account with customized graphics that I couldn't make sense of... until I resized my browser.

Their background image is set up to be static, so that as you scroll it stays in the same place on the screen, which is great, right? However, the image itself doesn't repeat, nor is it repetitive enough to be tiled; there's strategically placed text in both margins, and there's a white bar of nothingness just beyond the 1280-pixel line. The creator has a 13" MacBook with a native screen resolution of 1280x800, so they can't see any of this, but on my screen the text meant for the right margin sits behind the Twitter feed, and the right margin is just a large, white, vertical stripe.

I'm not trying to put down or speak ill of this person; obviously they simply don't know, nor do they have the means to check these things... but it reminds me of the days of old, when people would haphazardly put things up, not realizing that on any system configured slightly differently they not only don't look the same, they look downright terrible.

To the web designers out there: don't forget to check your page at several different resolutions for consistency, and check it in multiple browsers (IE, Firefox, Safari, Chrome, Opera, etc).

Wednesday, June 23, 2010

Network Speeds - GbE, Wireless N and how they affect you.

A significant debate in my mind between different wireless (and wired) network technologies has to do with effective speed.

What I mean by effective speed is two things: first, the speed you can actually get from the network (after overhead, crosstalk, and other factors); second, the speed that's actually useful to the end user.

Because of this, I end up in quite a conundrum... With server and back-end topologies and technologies, you generally know what kind of speed you'll need and what you can use. When connecting servers together, whether from scratch or to an existing network, you can work out whether you'll need GbE, 802.3ad-aggregated GbE, or a multi-gigabit connection (or even just a 100Mbit connection) for your server, depending on the application. A high-performance file server or database, for example, may deserve one of the more expensive connections, especially if the system will be used concurrently by many users and the drive array can sustain multiple gigabits of simultaneous output to multiple destinations...

For servers, it's pretty easy to deduce what you need; the hard part is not only finding the hardware (since 90% of computer shops carry only consumer-oriented products), but getting management to sign off on the purchase...

For client access roles and access points, you really have to start debating: is one technology really better than another? Let's review.

Almost all network access by end users (or consumers) is internet-bound. Not many people live in a world where an intranet even exists, never mind having servers set up on it or accessing any "local" resources. With this in mind, I quickly start to consider two things: first, how many people will be using the service, and second, what is the WAN speed?

WAN speed: most consumer systems are on consumer-grade internet lines, which are generally not terribly fast. In North America, most consumer broadband lines run between 3Mbit and 15Mbit. There are some exceptions, with extremely fast or extremely slow lines, but for the most part they fit this model. In these cases I have to question the value of buying the latest GbE router or switch, or the newest, fanciest dual-band Wireless N router or AP. Since 90% of traffic is going to be internet-bound, the fastest any one user's connection will go is 3-15Mbit. The current standard for wired networking is 100Mbit full duplex (100BASE-TX), and the current standard for wireless is 802.11g (Wireless G), which runs at 54Mbit. Both standards are several times faster than even the top end of those internet speeds.

Factor in that consumer internet lines don't really seem to be getting any significant bump in speed, now or in the near future, and you've found yourself in my debate.

If you're not using any resources on your local network, why do you need anything more than a 100BASE-TX or 802.11g network? ... To be fair, wireless technologies will never run as fast as advertised, because sending and receiving happen on the same frequency, making the system half duplex by nature (meaning you can only send OR receive, not both). But still, even in high-traffic situations, a half-duplex connection can sustain something near 30-40% of its maximum bandwidth (except in extreme scenarios).
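
To put rough numbers on that (a back-of-the-envelope sketch using the figures above, not measurements):

    # Back-of-the-envelope comparison using the figures mentioned above.
    wan_mbit = (3, 15)           # typical consumer broadband range
    wired_mbit = 100             # 100BASE-TX, full duplex
    wifi_g_nominal_mbit = 54     # 802.11g signalling rate

    effective_lo = wifi_g_nominal_mbit * 0.30
    effective_hi = wifi_g_nominal_mbit * 0.40
    print(f"Effective 802.11g throughput: ~{effective_lo:.0f}-{effective_hi:.0f} Mbit/s")
    print(f"Wired 100BASE-TX:              {wired_mbit} Mbit/s")
    print(f"Fastest typical WAN link:      {wan_mbit[1]} Mbit/s")
    # Even the pessimistic wireless estimate is comfortably above a 15Mbit
    # internet line, and wired 100Mbit is further above still.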

Additionally, a lot of the technology touted as "Wireless N" is really just a beefed-up Wireless G that's been given encoding similar to Wireless N (making it possible to encode more data per unit of airtime, and therefore increasing throughput)... What I mean is this: 802.11n was originally designed to run on higher frequencies, with shorter wavelengths (eventually they settled on the 5GHz band). With shorter wavelengths and better encoding, it became possible to fit significantly more data into the stream than was previously possible.

Allowing Wireless N on the same frequency as Wireless G causes additional interference: Wireless G takes more time to transmit and creates more noise on the channels Wireless N is trying to use, while at the same time Wireless N is unintelligible noise to any Wireless G implementations nearby. The real conundrum is that to use Wireless N at 2.4GHz effectively, you have to bump the channel width from 20MHz to 40MHz. Using 'channel 6' (the midpoint for wireless in North America) with a "fat channel" (40MHz), the radio then crosses over into almost every other wireless frequency, causing interference on every wireless "channel".
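
A quick sketch of why the "fat channel" is so disruptive (using the standard 2.4GHz channel centre frequencies; the 40MHz width is a rough approximation of the bonded channel):

    # How much of the 2.4GHz band a 40MHz channel centred on channel 6 covers,
    # compared with the standard 20MHz channels 1-11 used in North America.
    def channel_span(channel, width_mhz):
        centre = 2407 + 5 * channel       # 2.4GHz channel centre frequency, in MHz
        return (centre - width_mhz / 2, centre + width_mhz / 2)

    fat_lo, fat_hi = channel_span(6, 40)  # "fat" 40MHz channel on channel 6
    print(f"40MHz channel 6 occupies roughly {fat_lo:.0f}-{fat_hi:.0f} MHz")

    for ch in range(1, 12):
        lo, hi = channel_span(ch, 20)
        overlaps = lo < fat_hi and hi > fat_lo
        print(f"channel {ch:2d} ({lo:.0f}-{hi:.0f} MHz): {'overlaps' if overlaps else 'clear'}")

Running it shows the bonded channel landing on top of every one of channels 1 through 11, which is exactly the "interference on every channel" problem.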

The bottom line with 2.4GHz Wireless N is that it only really works in a controlled environment, where there are almost no other 2.4GHz networks or devices (and that includes cordless phones).

Add to that the fact that the extra speed isn't making anything go faster, because you're using that 150-300Mbit 2.4GHz Wireless N to access the internet, and you end up with a mish-mash of different, competing technologies that completely ruin the experience for everyone (since they cause so much interference).

The only true benefit you could ever get from Wireless N is in its intended implementation in the 5GHz band (where there's currently very little demand, i.e. interference), using dedicated 5GHz-only devices and nodes. Additionally, you would have to use that wireless link for accessing local resources; not just that, but you would have to make sure your AP, and every link between you and the system you're talking to, is GbE, since Wireless N can fully saturate 100Mbit Ethernet... Then, on top of that, you almost have to be accessing an array of drives to really take full advantage of the throughput, since even good conventional drives max out around 400-ish Mbit... That's not even touching how useless GbE would be to most users...

Yeah, I understand that, even though the bandwidth isn't really necessary, GbE can reduce latency because the time to put each packet on the wire is so short; however, the difference in real-world scenarios is negligible.
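
For a sense of scale, here's a quick serialization-delay calculation for a full-size Ethernet frame (a rough sketch that ignores switching and propagation delays):

    # Time to put a 1500-byte frame on the wire at 100Mbit versus gigabit.
    frame_bits = 1500 * 8

    for name, rate_bps in [("100Mbit", 100e6), ("GbE", 1e9)]:
        delay_us = frame_bits / rate_bps * 1e6
        print(f"{name}: {delay_us:.0f} microseconds per full-size frame")
    # Roughly 120 vs 12 microseconds: a difference of about a tenth of a
    # millisecond, which is lost in the noise of typical internet round trips.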

The really baffling thing, for me, is when there are respectable companies that actually have intranets, with dozens of client systems, roaming profiles, network shares, VoIP, internet, etc., all running over the same network, and they're still on megabit Ethernet. That's. Just. Amazing. Upgrading to GbE in those scenarios would have a massive impact, and the upgrade costs would be minimal, since a lot of unmanaged switches are rather cheap, even with large numbers of ports... and managed switches aren't too far behind in cost.

And really, in those scenarios, isn't the cost of the switch far outweighed by the increase in worker productivity, since now they don't have to wait forever for a roaming profile to load before they can actually do some work?

Food for thought.

Monday, June 21, 2010

It's been a while.

I know it's been a while since my last update. I've been thoroughly enjoying Android, and there isn't much I can say against it.

I've been intensely invested in mechanics lately, and therefore I don't have much productive content to add here. I will say that I've taken a new job that's more technically oriented. Hopefully I can find new and interesting ways to do the same job everyone else is doing, but faster.

I'm hoping that learning the systems involved in the new job, and all the contributing technologies, will let me break down what I learn and generate some new posts for everyone with some (hopefully) useful information.

There are some points where I think Android can improve; however, the operating system is being actively upgraded, so I don't have a lot of room to complain.

My main complaint is that, as a Motorola Milestone owner, I miss out on HTC's Sense home screen... I don't even have the option of buying it. Keep this in mind when comparing Android phones. Go check out your local cellular stores and compare a non-HTC Android phone, like the Milestone, to an HTC phone, like the Hero, Legend, Desire, etc. Particularly note the unlock and home screens. Also check out the media player and examine the differences.

I'm sure that if I rooted my phone I could probably install a hacked version of Sense; however, that's not what I'd prefer to do. All the features I would get from rooting my phone, I either already have or don't really care to have (besides using hacked software, which is probably illegal anyway).

No matter what you do, I recommend you buy protection for your phone. It's become painfully clear to me how many people drop mobile devices, so protecting your phone, either with a hard-shell or soft-shell case, is necessary. I use the hard-shell Otterbox case for my Milestone, but I know many phones have silicone skins that are also good at absorbing shocks. Choose what you're most comfortable with, because if the case drives you crazy, you're just going to remove it eventually, and then it will be a costly lump of waste in a corner somewhere.

I wouldn't normally care too much about software versions; however, some devices are still sporting very old versions of Android (e.g. the HTC Hero, at least until recently), where you'll only get Android 1.5 or 1.6. Normally I say whatever works, go with that; however, I've had a chance to use an HTC Hero with Android 1.5, and I have to say the changes since then are significant. Do your best to ensure the version of Android on the device you buy is at least 2.0 or 2.1 (Eclair); if you don't have at least version 2.0, pinch-to-zoom and other significant features will not be available on your device until an upgrade is issued for it.

My last comment on the Milestone is that the network access is incredible. As far as broadband goes, it's meager at best, but for a cellphone the access speeds are impressive. The Droid (a.k.a. Milestone) uses full HSPA on the network I'm attached to in my area. The speeds are so similar to what I get on wifi that there's rarely a significant enough difference to warrant switching over to wifi at all. The only time I activate the wifi on my phone is to update the apps on my device (where I'll be downloading several megabytes in a short period of time), and even that is merely to conserve 3G usage on my data plan, since my carrier charges per MB and allots only so many MB per month for data access.

Android is fantastic; if you're looking for a new smartphone, it's the way to go. With the thousands of apps in the Market, you're sure to find something to suit your gaming, entertainment, and productivity needs... Additionally, the system is fast, with lots of integrated features; the base OS is so good, in fact, that I tend to use the defaults for many things, since they suit me just fine. And you can manipulate almost every facet of the device's functionality, if you're so inclined.

Sunday, June 13, 2010

The Culture of the Internet

When you speak of culture, you're dealing with varying groups of people from different walks of life who are bound together by a particular trait they share. Maybe that trait is a love of animals, or the non-eating of animals and their by-products. Or perhaps it's a specific musical genre or way of life.

Whatever it is, these groups have all found their way ONTO the internet. Absolutely none of them come FROM the internet.

I say this because the internet itself has a culture, and I wouldn't want anyone to confuse internet culture with a culture found on the internet. 90% of cultures, genres, and groups of people originate beyond the internet and have simply made their way there.

I would say that the culture of the internet itself centres around places where original content (or O.C.) is produced, not from any specific source but from all sources. This culture's focal point is, in my opinion, the forums found at 4chan. The anonymous are the culture of the internet, and they've created many laughs for all of us, whether we know it or not.

There are, in my opinion, three sources for information about the culture of the internet...

First, you can join it, by going to /b/ and reading what's posted. Unfortunately, posts are rarely up for more than a few hours, so the content is constantly changing. This gives new meaning to the term "here today, gone tomorrow", because everything on /b/ changes every few hours... everything.

Secondly, if you choose not to join it, there are two notable sources you can get your information from. The first is Rocketboom. A bit better known than your other option, Rocketboom produces a YouTube series called "Know Your Meme", in which they review, and explain, the most notable output of the internet culture.
Rocketboom will explain everything the internet culture finds funny, while keeping you at a safe distance from the people in that culture.

The third and final option is to browse around the wiki of Encyclopedia Dramatica (which seems to be down at the time of this posting). This pseudo-encyclopedia contains everything you could want to know about every meme, written by the authors of the memes and their contributors. This sometimes makes the information hard to understand, or obscured in some way, since there are a lot of references to varying ways of portraying what they're actually saying (try looking up "starting your own religion" and you'll see what I mean).

Through any, or all, of these methods, you too can be savvy about whatever the internet is talking about. And if you're truly bored, check out the 4cha... wait, rules 1 & 2 of the internet prohibit me from saying any more...

I've already said too much.