Thursday, December 23, 2010

Untangle!

Version 8.1 of Untangle, which (as you may have guessed) inspired the naming of this blog, is finally about to be released... It's undergoing final testing, and one feature I've seen in similar products (such as IPCop) is finally being included.

Web Caching!

They just finished the plugin and everything... this is awesome.

What's the big deal? (you may ask) ... well, web caching is something your browser does automatically. It pulls a page down and saves it, and if you're like me and keep reloading content from the same pages over the course of a few hours, the browser just reloads the content it already has rather than downloading everything all over again.

You just said computers already do this, why is this a big deal at all?

... well, curious reader, Untangle is a routing system, so it replaces the in-house router in your home (kind of like a D-Link or Linksys box). However, because it's a full-fledged computer system, it has a lot more options available. Web caching at the router means that if you're sitting at home with guests in the house, and you see that picture that makes you go "ZOMG SO AWESOME!!!111one" and send it to everyone else in the house (presuming you're like me and all your friends bring laptops to your house), it just reloads from the copy stored on the router.

ALSO, if you're like me, this is important because bandwidth is expensive. Rather than downloading the same content 5 or 6 times, it gets downloaded once from the internet; the router intercepts the subsequent requests and responds with the locally cached version. This saves you bandwidth and, probably more importantly, saves you time. As long as it's implemented right, the local web cache responds much faster than most web servers, and transfers to you much faster than it would over the web.
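
For the curious, whether the router can reuse a response at all comes down to the HTTP headers the web server sends back. A rough illustration of a response a cache would happily hold on to (the values here are made up for the example, not anything Untangle-specific):

HTTP/1.1 200 OK
Content-Type: image/jpeg
Content-Length: 184320
Cache-Control: public, max-age=86400
Last-Modified: Mon, 20 Dec 2010 08:00:00 GMT

That "Cache-Control: public, max-age=86400" line is what tells any cache along the way (your browser or the router) that it's allowed to hand this same image to anyone for up to a day without going back to the origin server.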

The best example of the positive contribution to bandwidth is Windows Updates!! The dreaded evil things. They go out all the time and annoy you constantly, saying they need to install, reboot, etc. Now, with web caching, those updates are stored locally on the router, so when your computer goes to download them (sometimes several hundred MB in size) it simply retrieves them from the router. That goes faster, so updating is less painful (but still painful), and it doesn't cost you in your monthly usage (e.g. update size times number of systems).
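
To put rough numbers on it (illustrative figures only, not measurements): a 300 MB service pack pulled down by 5 machines in the house is about 1.5 GB off your monthly allowance without a cache, versus roughly 300 MB with one, since the other four copies come off the router at LAN speed.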

This is surely an exciting time, and I think I'll have to dust off one of my old P4 units and reinstall this new version of Untangle...

If you haven't heard of them or used them yet, check them out at http://www.untangle.com/

Sunday, December 19, 2010

Your apps are spying on you.

big surprise.

Remember TextPlus4 that you installed? Yep, now ad agencies have your phone's unique ID, location, and sometimes even your gender and age... who knows what else? Your contact list?

This article comes as no surprise to me.

I have beef with Apple and Google washing their hands of responsibility by merely requiring applications to disclose what permissions they're asking for, because honestly, if you want to use an application, the stuff it doesn't tell you it's doing doesn't really matter, right?

wrong.

My beef with this is that they're putting the focus on consumers granting the permissions that the software is requesting. A lot of users don't even know what these permissions are, or what they include. "Location"... okay, so if I turn on GPS and open that app, are they going to know the address of the house I'm sitting in? Or if I've recently "closed" the application (and it's still running a service in the background), and I get in my car and activate my GPS for navigation, are they going to know where I am then? And where I'm going?

The biggest complaint I have with the permissions isn't just that no one seems to understand them, it's that there are no options. You either grant all permissions, or you don't. By declining, you're also opting out of using the application.

How about this: the application requests permissions and you choose which ones it should actually have access to. For example, if I go to install the game Slice It (which I believe is available on both iPhone and Android), it requests (these are the actual permissions requested on Android): your location (both network and GPS), network communication (full internet access, network state, Wi-Fi state), storage (SD card access), phone calls (read phone state and identity), and system tools (retrieve running apps, change Wi-Fi state).

Why does it need all that?

Do I want Slice It to know where I am? Not really. Do I get a choice? Not if I want to use the application. Why does it need to know my identity and phone state? Does the application want to know if I'm on a call while playing the game? Doubtful.

Do you think that if I asked the developer why they wanted all these permissions, they would give me a straight answer? Probably not. Additionally, there's no way to revoke permissions other than to remove the application. This is my biggest gripe... Do you really believe that an application such as Slice It would NOT WORK if there were no location data and no network access? I'm guessing it would; I can put my phone into airplane mode (all radios off) and start up the game, and it will likely run fine.

I get some of the permissions; they make sense, like SD card access to save games and stuff, and network access to download levels without needing to update the whole application in the market, etc... but I should have the ability to grant or revoke those as I see fit.

The prompt shouldn't be "these are the permissions it wants, is that ok?"; the permissions requested by the application should be granted individually, rather than as a group.

But hell, what the hell do I know? I'm just a paranoid consumer right?

Saturday, December 18, 2010

RIP 8700M GT

Everything I buy, I buy on purpose and for a good reason. ... at least I try to.

I suppose I don't do nearly as much research for a toaster as I would for a laptop, but I digress.

The other day, the graphics card in my Alienware Area-51 m15x melted. The corner of the board overheated and the board melted, nearly destroying the components on the surface (in that corner, mainly resistors).

The tragic part is, if I had purchased the 8800M GTX that was available when I bought my system, I would've been stuck in the middle of a bad batch of video cards... namely, the ones that nVidia recalled. So I'm thankful it lasted this long.

Tragically, this means the card is dead, and has since been removed from my system.

On a more positive note, for those who don't know, the Area-51 m15x was one of the first laptops to offer hybrid graphics; it was the first generation of the technology. It's a really interesting topic if you look it up on Wikipedia or something... Anyways, here's a run-down. The system had two graphics cards: the discrete 8700M GT (in my case; others were available) and an integrated X3100 from Intel (GMA945 chipset). When gaming, crank it up to high with Stealth mode (down-clocking for power saving) off, and the nVidia card races along at full speed... For better power savings, switch to "binary GFX" (requires a reboot) and enable Stealth mode.

The difference was astounding. Wifi + nVidia + no Stealth on battery gave an hour or two; switch to the Intel card and pop on Stealth, even with wifi on, and play-time easily doubled. Add in the second battery and you'd be set for a full work-day of activity.

Anyways, I removed my nVidia card from the system (MXM-HE slot) and the system now powers on and runs without an issue. The most notable changes are A) decreased graphics performance (obviously) and B) no HDMI. That's right, the video-out on my system only works with discrete graphics.

Now I'm on a mission to get a price on a new (hopefully 9800M GT) graphics card for my system that will work with the HDMI output... Last time I got a quote, it was a cool $650-ish (with cooler), but since Dell has now fully absorbed Alienware, I'm not certain I can even get a quote anymore. I'll try again on Monday.

Wednesday, December 15, 2010

Let's install McAfee

I file this one under "woes of an IT Professional serving the public"...

it's more PEBKAC than anything else.

So, as some of you may know, I work on the phones, doing remote support for an ISP in the eastern United States. Yesterday, I got a call from a customer who was having some issues getting, and staying, connected.

It was a strange issue, since the customer had no problem getting to most webpages, but couldn't load either of the two he needed in order to connect his computer to me, even in safe mode (with networking)... After about 45 minutes, I decided to ask him to uninstall his antivirus (McAfee). I also asked whether he'd had something installed prior, and he mentioned he had McAfee before but had 'removed' it for the new version. Long story short, uninstalling worked: his 'removal' of the previous version hadn't been complete (it failed, crashed, or otherwise didn't finish), and he still had some of the firewall components from the old version.

I EXPLAINED THIS TO HIM.

We have a tool... the filename is MCPR.exe and it's the McAfee removal tool (there's a similar one for AVG and another for Norton). This tool removes any and all McAfee software from your computer. I was going to use it to uninstall the broken version, but first I wanted to download the installer so I could re-add the current version to the system when I was done. I asked the customer to log into the website where we download it, and he did, but clicking the download link came up with "error 31", which is something we see sometimes. We have a support department that handles error 31 specifically, with a significant amount of ease, and presuming the customer can get in touch with them in a timely fashion, we can have the error resolved in a matter of minutes.

So I send him off, finish up some of the other work I had to do on his system, take another call, and work on other people's computers... By the time I get back to his screen, HE'S INSTALLING MCAFEE AGAIN.

Yes, let's reinstall a problematic piece of software, right before we run a removal tool that will completely wipe out exactly what you're installing.

And he was all happy about it too; he typed into our support chat, "only 7 minutes left!"
I was like... no, we can't do this, I haven't run the removal. I asked you to resolve the error 31, not reinstall the application.

I immediately cancelled the install, shut it down and started the removal tool.
After that, reboot, got him to re-login to the download page, and re-downloaded and re-re-installed the application.

The system now works as intended... however, there seems to be an issue somewhere on Layer 8.

Sunday, December 12, 2010

FM Radio and Android

There's been some discussion in the past few days about whether FM radio should be supported by Google on devices that have the necessary circuitry, such as the Nexus One.

While there's been a constant and steady outcry for the feature from the community, as I understand it, the feature can be activated by using a non-stock ROM on your Nexus One, something Google has very much opened the door for by creating the Android OS and making it open source.

If you don't have a custom ROM, you'll at least need a custom application and root access. Neither of which is terribly hard to achieve.

As the story goes, everyone has to have an opinion.

My standpoint is, why not?

I don't think Google should have to take the time to push out an FM tuner application; the community can do that easily (heck, as far as I understand, they already have). But why require users to have root access? My question is, why doesn't Google allow the software to work WITHOUT root? The changes would be minor, probably just a matter of adding a module to the kernel to expose the functions and initialize the device; not difficult. Then, when the feature is enabled, people who want it can download the third-party applications from the community to make FM work. If it's successful enough, and the media player design team is sufficiently bored, why not have them add the feature to the built-in media player on the Google Experience devices?

Makes sense to me. It would require the minimum amount of effort from Google to make the system work, while maintaining a happy client base by letting them use the feature without requiring root, and it would also keep the streamlined look and feel of the phone.

Maybe in Honeycomb.

Thursday, December 2, 2010

how-to document

So, one of the companies I work with (contracted) emailed me the other week saying they needed help with some specific aspect of their software. I manage their server, not the software on it, so I sent a message back saying that I don't know the software they're talking about and that they should contact the support personnel for that software, since it's very specialized.

They did, and word got back to me that the tech support from that company wanted access to the server to debug the issue. Okay, not a problem... but that was two weeks ago.

Just today, I was told WHAT access the agent needed. Though the information was ambiguous at best, I made my best estimate of what permissions he needed and set up his accounts.

Now the hard part.

Part of the issue they're experiencing (from what I understand) is that they can't seem to install the client software on a few computers and get it to work... This has been a major problem in the past and we've had to overcome many obstacles with this software because it's so specific.

Some of these obstacles include:

Region settings - If the operating system running the software is set to any region other than "United States", then when the client verifies its version with the server, the server will reject it as a version mismatch, since the release date of the software is included in the version number stamp sent to the server.

The error stems from the fact that the US puts the day, month and year in an order that no one else seems to use... This is a problem because the company I work for is Canadian, so some of the systems they use are set to the "Canada (English)" region, which formats the date differently (resulting in a mismatch).
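
To illustrate with a made-up release date: a build from June 5, 2010 gets stamped 06/05/2010 under the "United States" format (month/day/year). The same date under "Canada (English)" comes out as 05/06/2010 (day/month/year), so the stamp the client sends no longer matches what the server has on file, and it gets rejected as a version mismatch.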

UAC - User Account Control doesn't function correctly when giving permissions to the update installer. So updates always fail when run in user-mode. You have to start the application as an Administrator and log in to receive the updates...

Address - For some reason the software always sets the default server address to "localhost", which is obviously wrong, and it throws an error every time it's launched. This needs to be changed to the IP address of the server.

ADDITIONALLY, when the client connects, it then needs to access the database, which can be at an independent address. So when that connection attempt fails, it throws another error, because the database server is not "localhost" either (and that also needs to be changed to the correct IP).

ALSO! The addressing settings are stored per user, so if you run the software as an administrator, it will 'forget' the addressing settings when re-run in user mode, or when run as a different user on the same machine.

We determined all of these flaws the first few times we installed the software, and by "we" I mean myself and the couple of other guys who WERE employed by the company (not anymore) and were working on the project... They knew the workarounds, but now they're no longer working on the project, so everyone turns to the contracted worker.

It doesn't work and I set it up... therefore I must have done something wrong, because surely it's not that the system isn't straightforward to expand... Nope. You did something wrong by firing the only workers who knew anything about the system, and then you blame everyone else for not having the knowledge.

So! I wrote a very long email, and included pictures (everyone loves pictures!), to send off to the relevant parties now on the project... These much less technical people are not very skilled and simply don't have the necessary knowledge of the computer systems they use to do much debugging or trial-and-error. If it doesn't work in an obvious way, they're probably not going to get it to work... at least when it comes to software.

Oh well, now I have to go do some billing...

Monday, November 1, 2010

You didn't get the virus from your ISP

It amazes me how much misinformation is still going around about how and why people get virii infecting their systems. Since one of the primary focuses of my job is removing malware from client computers, I've become exceedingly good at it. The stuff that amazes me is when people wonder why they have to pay for a service to remove the virus that's now stopping them from getting online.

There are a few major schools of thought on this, I'll go over three, starting with a common one, but not one I'd like to focus on today.

"You're supposed to be providing a service. I can't access that service. This should be free!" ... Since our positioning is as PART of the ISP's infrastructure... (factually not the case)... they believe that if for any reason, they cannot get online, it falls into our (the ISP's) job to fix whatever the issue may be. No exceptions. Unfortunately, we only guarantee internet coming out of the back of OUR modem. If YOUR equipment can't receive the signal and utilize it, well, that falls into our department as much as your broken TV falls into the task of your Cable Co.'s jurisdiction to fix it. AKA, not.

This next one is super common...

"But I have XYZ Security software from you to prevent this type of thing!" ... Okay. Just because you got a virus while using a service we re-sell on behalf of another company, does not entitle you to use a paid service for free because something got past it. If you would've read the service agreement, I'm sure you would have seen a 'best effort' clause... stating that the protection is not 100%, that they can only defend against KNOWN virii. Hense, if the virus is NEW, or not yet known, you can still get it.

I love that one.

... not.

This last one is less common but just as irritating:

"I get internet through you and therefore I got the virus from you, so it's your obligation to fix it" ... okay. You didn't get your virus from us, you got it from the INTERNET. There are a lot of people and places on the internet, and scanning all traffic for virii is an ASTRONOMICAL TASK!! We're not going to do that, nevermind the fact that even if we did, it wouldn't be perfect (see previous point). YOU were the one dumb enough to click the link and run the software, on YOUR equipment... the internet works fine, you're just a dumbass. Just because we provide you with a connection between you and the web doesn't make it our responsibility what you do with that.

Enjoy your virus.

Anyways, I had to rant there briefly, I hope no one minds. Play safe, and keep away from those damn dirty virii.

Saturday, October 2, 2010

Internet Explorer Woes

As time rolls on, I sometimes get to wondering why people still use IE. It's simple enough I guess: buy a computer, use the internet. This happens to be Internet Explorer, and that's fine for most people.

Anyone I know who seems to think they know anything about computers at all is using Firefox. A good choice. I use Chrome as my primary browser, and fall back on FF for anything Chrome seems to choke on (not much). If that doesn't work, I'll fall back to IE8, just to see if it will work, and if that doesn't fix the problem the website is having, I usually chalk it up to webpage design and move on.

HOWEVER, IE has so many quirks and quarks.

Working in IT, it's plainly obvious that IE relies on jscript.dll for its JavaScript function calls. It's also plainly obvious that the Java runtime included with Windows/IE is terrible at best; it's almost always replaced with Sun Microsystems' JRE. And it's plainly obvious that without JavaScript, almost all Flash will not function, since the calls made to make Flash work are largely JScript-fueled.

There lie your three most problematic components of IE. Now, I'm sure FF isn't terribly better, but it seems so, because not as many inexperienced users (who are getting viruses and messing up settings) are using it.

So, if JScript works, and Java works and Flash works, you usually have a good-enough browser for anyone.

Here's the problem: it seems that 64-bit DLLs and 32-bit applications don't mix well.
In Microsoft's "infinite wisdom", Internet Explorer 8, 32-bit and 64-bit, both use the same registry keys for normal operation, which streamlines settings. If 32-bit IE8 gets set up for a proxy, then the 64-bit version will also use that same proxy.

The problem lies in the fact that once the 64-bit jscript.dll takes over the registry key, you're basically FUBAR for getting JavaScript to run in 32-bit IE. And while JavaScript will run in 64-bit IE, nothing else really works well there, or at all.

Sure, there's a 64-bit workaround for Flash, and a 64-bit JRE from Sun, but installation is painful at best, and nothing besides those three things will work with the browser, probably not for many years (possibly ever).

This has all become painfully obvious to me since I had a customer at work with a copy of Windows 7 Home Premium x64, with both the 32-bit and 64-bit versions of Internet Explorer installed.

Personally, I haven't found a good solution for the customer. I was excited to give it a good try today when I saw the chat session (at the time, upwards of 30 hours connected) drop into the queue. I'll also be looking for them tomorrow and the next day, hoping to pick up their chat. (I may even give them a call back to try to resolve it.)

I suspect reassigning jscript.dll to the 32-bit version would fix the issues they're having... I would just need to make sure they won't accidentally drop into the 64-bit version.

Two questions remain: first, where the heck is the jscript.dll reference in the registry for IE? Second, does the system even have the 32-bit jscript.dll installed?
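
My first attempt will probably be to re-register the script engine for both bitnesses and see if that straightens out the registry entries. Something along these lines from an elevated command prompt (a guess at a fix, not a verified procedure; on x64 Windows the 32-bit DLLs live in SysWOW64 and the 64-bit ones in System32):

%SystemRoot%\SysWOW64\regsvr32.exe %SystemRoot%\SysWOW64\jscript.dll
%SystemRoot%\System32\regsvr32.exe %SystemRoot%\System32\jscript.dll

If the 32-bit copy turns out not to be there at all, that second question answers itself, and it's probably time for an IE repair or reinstall instead.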

I'm not sure of either, but I'm excited to try.

Sunday, September 19, 2010

usb.brando.com

The following is an open letter that I've sent to Brando. My correspondence with them so far has been sub-par, primarily with people who don't seem to have a grasp of the English language, never mind any idea of how USB charging works, or the nuances of the voltage differences.

Dear Brando,

I purchased your USB Wrist Band battery pack. Upon receiving it, I was thoroughly impressed with the overall design and comfort of the device, and after charging it I was eager to start using it.

It turns out that the unit you sold and shipped me was defective, outputting only 4.0 V of the rated 5.5 V (according to your webpage).

I will not be returning this product for a refund, since shipping costs would far exceed the original cost of the product (35 USD), but I will thoroughly ensure that no one I know, or have any contact with, ever buys anything you sell, since, so far, the only responses I've gotten from your customer service department have been bullshit.

So you know, the 4.0 V output doesn't even push any of my USB devices into recognising that a power supply is attached, never mind charging anything.

Thanks for nothing.

Wednesday, September 8, 2010

Women in Technical Fields

This article was posted both on Twitter and on Facebook by the lovely and talented Felicia Day, whom a friend of mine recently had the opportunity to meet at FanExpo, which I wasn't even aware was happening... Next year... Next year.

Aside from all that jealousy about what my friend was able to see and do at FanExpo without me, Felicia, interestingly enough, is making quite the point. If you read the article (and admittedly, I was agreeing with it so much I never quite made it to the end), it describes a situation where we're almost forcing women into technical fields. I've always felt forcing an issue is wrong, especially one of this nature.

This actually reminds me of an episode of the show "Rescue Me" that I recently re-watched, where the lead character Tommy (Denis Leary) goes on a rant about racial equality in the FDNY; he makes the very important point that maybe it isn't the FDNY that's not hiring blacks, Jews or whoever into the department, but rather that it's the blacks, Jews or what-have-you who simply don't want the damn job.

I would apply that logic here, even though I agree with the article. There's an opportunity here to promote women in the workplace, and certainly there are many women who have the physical capability of working in IT/tech, as well as the brainpower to do it (given proper motivation, interest, etc.); many, however, simply don't take enough interest in HOW the internet works to get into the inner workings of the field. If these people, regardless of their gender, don't have an interest in the job, are they really the right people for the job? Why should we, or anyone for that matter, be putting them into a position where they're doing something they may not enjoy?

To be totally blunt, I've always had the mindset that I don't care if you're black, white, Jewish, Spanish, Mexican, Egyptian, Chinese, Cuban or Hungarian; if you can do the job, and do it well, then you should be doing the job. It's all about the right person for the right job, and getting the job done; it's never been about WHO does the job, not to me. And that goes for every profession, everything from firefighters to IT and beyond. If you want the job, and you can do it, then go for it.

I don't see why anyone would want anything different from that.

Sunday, September 5, 2010

Telus Milestone rooting/flashing

I don't want to post much about personal stuff, but this is far too interesting/helpful to keep quiet about. So here goes.

As many will know, I purchased a Motorola Milestone from Telus a while back. When I got it, it was running stock Android 2.1 (the phone originally shipped with 2.0.1), and since then there's been a further update to 2.1, done OTA by Telus to correct call quality issues.

I started out very happy with the phone, but after dealing with the brief (but noticeable) slowdowns and delays, plus the possibility of never being updated to 2.2, and therefore never having the Flash player or any of the official 2.2 features, I started looking around.

To be clear, the Milestone here in Canada is supposed to be getting 2.2 (Froyo) in Q1 2011. That's still quite a ways away, and I'm impatient.

When I started, I began by 'rooting' my phone... This process was easy, using a paid, but very easy to obtain, rooting app called "Universal Androot" from the market... After that, I unsuccessfully attempted to use ROM Manager to swap ROMs, which resulted in paying for software that doesn't actually work on the Milestone. I don't mind; I'll support good developers regardless.

The best way to root and set up a Milestone running the newest Telus ROM, in preparation for reflashing it, is to use RSD Lite (available almost everywhere) and get the Milestone "vulnerable recovery" SBF file. Installing this SBF will allow your Milestone to boot an update.zip that is not signed by a certified authority (AKA Motorola). After you have that basic file and app prepared, and before flashing, you need to set up an update.zip and nandroid on the SD card in the device. I used Open Recovery for this.

In theory, you could use many of the various options for Recovery software... Open Recovery was my choice and I feel it's been a good one.

Once Open Recovery is situated with the update.zip file on your SD card, you're free to apply the vulnerable recovery SBF to your phone using RSD Lite. This process is highly automatic, and only requires you to have the specific files, as well as your phone connected to the system via USB.

But hold on, don't let that phone start back up. Every time the phone starts with the Telus ROM, it checks the bootloader's checksum. If the checksum does not match the checksum on file, it reflashes the bootloader to spec. Before this happens, you'll need to boot into Open Recovery to make a change.

After the phone restarts from applying the vulnerable recovery patch (it will say "update OK", then shut down for a restart), hold your phone's camera button as it's starting. You'll get to the screen with the phone and the exclamation point (you'll know it when you see it); at that point, hold volume UP and press the camera button again, and you'll get a menu.
Using your d-pad, select "apply update.zip" (a process you'll have to repeat every time you want to get into Open Recovery).

Open Recovery loads. Browse over to the console. At the console, run the following command:
rm -f /system/etc/install-recovery.sh

This should delete the specific file that would otherwise reflash the bootloader/recovery and prevent you from getting back to nandroid/Open Recovery.

After that, you're rooted with a working recovery. Leave update.zip, nandroid, and Open Recovery on your SD card for future reference, and download a ROM of your choice. It may be a good idea to use nandroid to take a backup of your current ROM before deciding to wipe and flash a new one.

Whenever you flash a new ROM, delete all user data, the Dalvik cache, and the cache partition. This puts the phone into a very raw/fresh state for the new ROM and prevents a lot of problems you might otherwise run into.

Enjoy, happy flashing.

Saturday, August 21, 2010

I'm at a loss

It's not a frequent thing when I'm at a loss on how to fix something, especially with anything electrical.

It's usually a cold day when I give up on something; I'm usually the last to give up, and when I do, it's for good reason. I've only ever had one false positive in relation to a bad device: I got rid of a Linksys LNE100TX (a very good NIC) for a D-Link DFE-430TX (also a good NIC), and when I came back and installed the D-Link, I found the real source of my problem, which had persisted through the NIC change; however, I had already thrown out the Linksys and couldn't retrieve it.

I've done crazy things with computers, from building systems from scratch, to fixing the otherwise unfixable that others have given up on as "too frustrating", all the way to re-soldering resistors onto the power riser of a laptop that wouldn't power up (it worked afterward, by the way).

My point here is, I'm usually the last to give up, and if I give up, I have a good reason. My expertise with all these things is fairly expansive, and only hits a limit when dealing with determining problems on a circuit-by-circuit basis. I have a multimeter, but I'm not always sure what information it's giving me, or how relevant it is.

I've recently been working on a friend's Dell XPS M1330, which has a bad video processor (on-board, on the motherboard), requiring a motherboard replacement. I ordered one; it happened to come from China. The shipping time and method were reasonable, the packing was adequate on receipt, and the board looks to be in good condition. After installing the board and applying all the screws, cases, cards, thermal paste and coolers (including modifications to the original design, i.e. the "copper mod"), I connected power before powering up the unit, to test for function, and the charging light didn't come on... That's odd.

I checked, and found the power light on the PSU wasn't lit. Okay? It's plugged in, what gives?

After extensive testing, the new motherboard seems to be causing the power supply to randomly shut off, probably a safety fault or something. I connected the system to another PSU (a variable-voltage, adjustable, universal laptop power supply), and all that accomplished was to reduce the (albeit small) display on the cord, which normally shows the voltage, to a very dim jumble of nonsense. Disconnected from the system, the display reset to its lowest value, 15V; attempting a 15V connection resulted in the same problem.

Attach either of these power supplies to the original mainboard (which works, apart from the graphics being completely unusable) and the result is the same: it powers up fine.

I emailed the supplier, who claims "it was tested before being sent out"... If that's the case, then sometime between you testing it and me testing it, it stopped working.

I've actually handled this motherboard BETTER than the original one that came with the system (which I accidentally dropped on the floor from about a foot up), and yet the original works and the "new" board doesn't. Explain that.

The supplier is willing to do an exchange, so the board will be going out on Monday.

I can't help but be a tad insulted by the supplier's insinuation that I'm wrong and the mobo works. I tested the thing. I even checked the resistance between the positive and negative input terminals on the original mainboard, and the resistance between them on the two boards is very similar (there will be some variance due to the tolerance of every resistor on the board; e.g. a 100-ohm resistor with +/- 5% tolerance could be anywhere from 95 to 105 ohms), so, in theory, the two boards should read very close to each other. I even tested with the elbow of the universal power adapter plugged in, to check for faults in the adapter itself, which I would've been willing to desolder and exchange with the power connector from the original mobo if I thought it would fix the problem; however, the resistance doesn't change, indicating no short in the connector.

I'm baffled. Apart from a non-obvious fault in a circuit that handles the incoming current on the mainboard, I don't see why this would be happening.

Oh well, another 2-3 weeks before I can reassemble this with yet another motherboard.

Friday, July 30, 2010

Routing tables

I've been dealing with this problem for a while, and I've only had a few instances where I *thought* it might be the problem with a specific person's internet, but I've never actually SEEN it on any system other than my own.

I work next to people who see this all the time; their solution is to uninstall the NIC from Device Manager and let Windows re-detect it. This is obviously not the best solution.

I'm not sure where the error comes from, but it seems to primarily affect wired interfaces when you shut down and restart connected to the same, or a different, network. I haven't been able to determine exactly WHY or WHEN it happens, so all of that is theory, but here's what happens.

The routing tables receive an incorrect entry stating that the local area adapter has a default gateway of 0.0.0.0

This entry, despite not being assigned by any DHCP server, is immune to the release/renew attempts made by normal adapter "repair" operations.

Essentially, this entry tells the computer that every possible address on the internet should be sent out onto the local area network; that every possible IP address is local. So, when your computer connects to Google (for the purposes of this example, we'll use a common Linksys NAT setup: a private Class C network with local DNS and DHCP), it needs to go ask your DNS server for Google's IP. In this case, your DNS server will be an address local to you, such as 192.168.1.1 (one of the most common router IPs), and it will respond with something not local, such as 173.194.32.104. Your computer then checks its routing tables and falsely determines that it should be able to connect to Google directly, which it obviously can't do... so it requests a connection and fails (at OSI layer 2) every time.

The easiest solutions to the problem are to reboot, uninstall the adapter, or otherwise clear the routing tables by some indirect method.

The most efficient way is to do a "route delete 0.0.0.0" with added options to tell the route command to delete just the incorrect entry.

MY solution is just to perform a "route delete 0.0.0.0", which clears ALL default routes, then do an ipconfig /release and /renew.
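
For reference, here's roughly what that looks like on a Vista/7 box, from an elevated command prompt (the last gateway argument on the targeted delete is the bogus 0.0.0.0 entry itself):

route print
route delete 0.0.0.0
route delete 0.0.0.0 mask 0.0.0.0 0.0.0.0
ipconfig /release
ipconfig /renew

The first command lets you spot the bad 0.0.0.0 gateway entry, the second is my blunt "clear all default routes" approach, the third is the more surgical version that should only remove the bad entry, and the release/renew puts the proper default gateway back via DHCP.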

I don't know where this problem came from but it seems to only affect Windows Vista and Windows 7.

Hopefully that helps some of you folks out there. Happy routing.

Another post about Android

I was reading an Android Spin article about why Android will win the mobile wars, and I got to thinking of so many reasons why this could or couldn't be true.

They have an excellent point in regard to devices. While there's only one iPhone (only one current model), there are around 15-20 Android devices on the market at any given time. Though the iPhone is more user-oriented than developer-oriented, and Android is the reverse, I believe that in the end it will be Android's developer-oriented approach that causes developers to CREATE a more user-friendly experience.

Though there are a lot of Android devices around sporting similar technical specs (3.7-4 inch screen, 1 GHz chip, 512 MB or so of RAM, microSD expansion, Android 2.1), aka the Nexus One mold, there are phones out there with varying features beyond just the size of the screen, the look of the buttons, and the resolution of the camera. Some have hardware keyboards, like the Motorola Milestone (my device) or the Backflip.

As we slowly prepare for the next generation of Android operating systems and associated hardware (rumors of a dual-core 1.5 GHz Snapdragon chip are in the mix), we're quickly approaching cellphones that are more powerful than some people's home computers. Of course, the two are not directly comparable, since desktops use the CISC x86 instruction set and cellphones (mostly) use RISC ARM architectures.

These so-called "cellphone wars" are heating up, and the iPhone, while a powerful device, is only ONE device against so many, so it's looking at a very tough battle to stay relevant as time goes on. Blackberry has their niche business market, and Windows Mobile... well, I'm pretty sure most Winmo users have been looking for something *not Apple* and *not Blackberry* to move to for a while, and that may account for some of this Android love. I know that's what pushed me to it.

Thursday, July 15, 2010

Routers

There are so many options for routers around these days... I have different recommendations for everyone.

For those who don't really know what they're doing, get a major brand, generally Linksys or D-Link, and run through the quick start guide. ** DON'T JUST PLUG IT IN AND GO ** Leaving your device unconfigured can leave a lot of security holes in your local network, so just don't do it.

For those who do know, I've used, and prefer, DD-WRT. It's lightweight, works on router-specific hardware, and is remarkably fast. Of course, your ability to use DD-WRT depends on what router hardware you're installing it on, so choose wisely: pick something that has the features and capabilities you need from your router, and then see if it's DD-WRT compatible. If not, find alternatives.

Also check the router's compatibility notes, to see whether any of its features can't be used yet.

For anything larger than in-home, SOHO style, I would normally recommend independent router/wifi/switch setups, because if any one goes out, it's pretty trivial to figure out which.

I've also used Untangle. It's good for SOHO or small business setups that need constant, centralized protection, away from user interaction and controls. Untangle requires some pretty heavy hardware, so be sure to buy something appropriate for the job it's going to have to do; beyond that, it should be able to do just about anything you need it to... with few exceptions.

No matter which solution is best for you, whether listed here or not, be sure to read through the documentation to familiarize yourself with the menus and capabilities of the product.

Happy Routing.

Tuesday, July 13, 2010

TELUS APP for Android

Just a quick note.

I try to orient my blog toward the world, but this one is Canada-specific, though everyone should enjoy the amount of fail in this one.

I'm a Telus Mobility subscriber, and they call a lot of their self-service options "My Telus" or some such nonsense. With that said, they released their second Android app onto the market in the past week... The first was a music application (to buy music wirelessly).

This second app is for 'direct access to mytelus'. The app seems to be a launcher for an application or website that does not yet exist, so currently it does nothing but load a page that says "coming soon".

... why, Telus? WHY? Why not make an announcement online and wait to release the application when it's complete, or when the service is ready?

This whole thing is failing so hard that it's hard for me to understand much else.

Tuesday, July 6, 2010

What I use.

A lot of people regard me as an authority on IT, Computers and Electronics as a whole. While I'm far from being perfect, I will say that I usually do my research before making a decision.

The most important thing for anyone to do when buying a new gadget, unless you're an early adopter, is to look it up. Look up features and specifications, and most importantly, look up BAD reviews. No matter what, the bad reviews will tell you about the parts of the device that may annoy you. Nearly EVERYTHING will have poor reviews somewhere, so look them up; they'll often tell you something you won't see anywhere else. Good reviews will usually just go over features you already know the device has, so skip those.

As an authority to many on all things digital, people are often curious about my gadgets and devices and what makes them so special. I'm usually pleased to tell them all about what I love and hate about the devices I use.

That's going to be the main focus of this posting. What do I use, and more importantly, why?

To preface: I often seek devices with similar traits; things like SD cards and USB connectors are important to me. Having only one connector that works with only one specific device is, in my opinion, bad, since if I lose that cable or stop using the device, I can no longer use the other, making both useless as soon as one becomes useless. So you'll notice that most or all of my devices are exclusively SD, or include SD functionality, and all of them include or are compatible with either mini or micro USB. While there are exceptions to this, I'll often avoid a product BECAUSE it's lacking one or the other.

With that said, I begin. The core of my gear is, was, and probably always will be my cellphone. It is the single most important device day-to-day, and the one device that can do every task in at least a limited fashion. So I'll break this down into five parts: Video, Pictures, Music, Multimedia (video playback), and Other.

My Cellphone: Motorola Milestone, AKA Droid.
(550 MHz, 256 MB RAM, 512 MB ROM, up to 32 GB microSD, HSPA, Android 2.1)

Starting with Other, I'm mainly talking about web browsing, games, etc. This is the ONLY area where my cellphone is not my best piece of equipment for the job; my computer works better. However, on the go, my cellphone is the best way to look up maps or browse the web for just about anything. For games, again, my laptop wins hands-down, but the second best device for that is my cellphone. In this category my cell loses in almost every respect to my computer, as it does in certain other areas.

Laptop: Alienware Area-51 M15X
(2.5 GHz Core 2 Duo, 4 GB RAM, 320 GB HDD, nVidia 8700M GT)

Moving on to Pictures: my cell does a 5 MP shot with dual LED flash, which is fairly good; however, the picture quality is really lacking. As a photographer, I would never take pictures from my cellphone seriously. I also have a Canon PowerShot A640, which is a 10 MP digital camera with 4x optical zoom. While I'm disappointed with my digital camera on the whole (in that I would prefer a DSLR), it produces VASTLY better pictures and has a MUCH better flash. I've used quite a few shots taken with my PowerShot for prints.

I got this camera about 2-3 years ago, refurbished, so I saved a lot of money on it. It's fantastic for the cost, with great features, but not nearly enough to satisfy my picture-taking appetite. I tend to stick with Canon because I know the brand; I started with a Canon AE-1 35mm SLR, and I've consistently seen good results from Canon; that's something I'll pay for. I've heard and seen good things from Nikon and some other brands too. I will always steer away from Sony and Fujifilm, since each has its own, almost proprietary, memory standard (Memory Stick for Sony, and xD for Fuji), which doesn't fit my basic qualification of SD memory.

Next up: Video. My cellphone will take 720x480 video at ~30fps (I believe). My Canon PowerShot will take 640x480 video at ~30fps (the "square" version of my cellphone's resolution). I also have a Sanyo Xacti VPC-ZH1, which is a 720p (30fps) digital camcorder. It uses SDHC as its primary storage. As a rule, I currently avoid dealing with video larger than 720p widescreen at 30fps. Not because I don't like it, or because it's too hard on my computer, but because I don't see the value in it.

To make good use of a compressed video stream, it needs to be judged on image quality versus size, and you can get a much HIGHER QUALITY PICTURE from 720p, despite the lower pixel count, than you can from 1080p video, even if the 1080p has a higher bit-rate. In my opinion, 1080p is never worth the extra space it takes up. Unless the pixels are the size of a fist, you're not going to see a significant difference between 720p and 1080p streams on similarly sized screens (even larger home-theater screens). Yes, if you're running an ACTUAL theater, you'll want 1080p as a minimum to project onto a several-hundred-inch screen, but for home use, it is my personal belief that the difference between 720p and 1080p is lost.
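
The math behind that opinion: 1280 x 720 is about 921,600 pixels per frame, while 1920 x 1080 is about 2,073,600, or 2.25 times as many. At the same bit-rate, each 1080p pixel gets less than half the data a 720p pixel gets, so unless the 1080p file is proportionally bigger, the "bigger" picture is actually being described more coarsely.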

With that said, I do believe 1080p has its place; I just don't deal with video that would ever warrant a 1080p stream. Additionally, 720p is much easier to manage, has much smaller file sizes, and can produce some stunning results.

For someone who intends to upload video to the internet, a lot of the detail of 720p will be lost in translation to the compression algorithms used on YouTube anyway, never mind 1080p.

So, for my purposes, this Sanyo Xacti camera does a fantastic job, though I've only recently acquired it and have barely scratched the surface of its use.

Music. While my computer plays all kinds of streams from Grooveshark and similar services, my cellphone is great as a limited-use MP3 player. I say limited because listening to music on my cell drains the battery more than I would like. Additionally, because I don't want to maintain multiple playlists for multiple devices, I create one big playlist of random songs that I like listening to, dump it on every MP3 player I have, and go from there. So whichever media player I'm listening to, I basically get the same songs no matter what device they're on.

With that said, I'd rather save my cellphone's battery life for browsing the web, checking Facebook or Twitter, making calls, sending texts, etc. So I have a dedicated music player. The music player I use is the Sirius Satellite Stiletto 2 Unit. It has a microSD slot for expansion, and can record music live from the radio.

Of course, being a Sirius unit, it also has (both Satellite, and Internet Radio access to) many commercial-free music stations. I use this unit in my car, or while walking down the street. If I lose satellite coverage, I jump over to MP3 mode, and play all my recorded favorites (from the radio) randomly mixed with a random collection of favorites from my MP3 folder at home, stored on my SD card.

While the player has its problems, like adding static to an MP3 occasionally, it's fantastic for listening to live music, comedy, news, or whatever. It's capable and has very good battery life, in my opinion.

Next is Multimedia. With this category, I'm primarily talking about playing video on the go. For this I have to reach back into the depths and reveal one of my oldest gadgets; but first, my cellphone can play back multimedia: as a current limitation of Android, the video files need to be in mp4 (AVC/AAC) format, at VGA or 480p resolution. Since most of my video collection, whether movies, TV or other, is AVI (DivX or XviD), this requires transcoding to play the content on my cell, which takes time, etc.
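
When I do bother transcoding something for the phone, it's a one-liner with a tool like ffmpeg (a sketch only; exact option names vary between ffmpeg builds, and the 640x480 target is just the VGA case mentioned above):

ffmpeg -i episode.avi -s 640x480 -vcodec libx264 -acodec aac -ab 128k episode.mp4

That spits out an AVC/AAC mp4 the stock Android player will handle, at the cost of the transcoding time I was complaining about.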

My oldest, and one of my favourite gadgets, has been, and still is, my Dell Axim X51v. With a 624 MHz Intel XScale CPU, 64 MB RAM, 256 MB ROM, an Intel 2700G video processor, an SD card slot, a CompactFlash card slot, wifi, and Bluetooth, this 5-year-old Windows Mobile device has a lot to give.

Even now, at the time of this writing, there aren't many Winmo devices on the market that exceed the specifications of the Axim X51v; and running lenny's Winmo 6.1 or 6.5 with TCPMP (The Core Pocket Media Player) with full acceleration on the 2700G, including full-screen (640x480), jitter-free DivX and XviD playback, the Axim is not a device to trifle with.

Having both SD and CompactFlash is a bonus too. I have a 2 GB CompactFlash card that isn't compatible with many of my other devices, so I use it almost exclusively for multimedia files on this device... I can honestly say that I am looking for, and have yet to find, a media player that parallels the capabilities of this device.

I used to use my Axim for my day planning before I bought a smartphone. It was excellent for keeping track of when I had to do what. Additionally, it had an excellent contact list, to-do list, and other features that made it great. Now, though it may only be used as a media player, it's still at the top of its game.

The only downside to the Axim is its clearly dated screen. The 3.7" trans-reflective TFT display does not have very good viewing angles or contrast; beyond that small downside, the device performs better than most I've encountered.

That's my basic kit. If I end up going out and need to do something specific, I can take the specific device out with me that I'll need (either the Canon Powershot, Sanyo Xacti, Sirius Stiletto 2, or Axim), otherwise the Motorola Milestone is a good catch-all for anything I need to do.

No matter what, always research anything you want to get. Bad reviews are good indicators of a product's annoyances and downfalls, but like everything they should be taken with a grain of salt. SD and mini/micro USB are good standards to live by too. Make your own decisions based on your own situation in life.

Sunday, July 4, 2010

Audio Codecs

@nickromyn is someone I follow on Twitter. Today, he posted a question about Android and its capabilities regarding ALAC, the Apple Lossless Audio Codec.

Though Nick is an avid Mac user, he's not completely hopeless when it comes to PCs and related technologies. He's not a complete Apple fanboy, in the sense that he'll give credit where it's due, whereas most Apple fanboys won't even recognize that a product has any merits that may be in any way, shape or form better than their beloved Apple product (link NSFW, language).

His message was focused on managing playlists/media using a centralized system (like iTunes), which I'm not sure is universal across Android-based devices.

Before I get to my main point, I'll go over this first: I have the Motorola Milestone, which can use the Motorola media center software, and that will do a lot of this; additionally, I'm sure there are more than a few apps that will do it, if Google hasn't already made something that interfaces with the device... I'd also be fairly certain HTC has put something out to go along with the Sense media player, so there's that.

My main point is on Audio Codecs. Since Nick pointed out support for ALAC specifically, I have a few points to make about the matter that I'd like to go over now.

First of all, as far as lossless audio codecs go, I would never use ALAC, because FLAC exists. FLAC, or Free Lossless Audio Codec, is an open-source lossless codec freely available to anyone, or any developer, to include in their device or project. Most media players have picked up support for this relatively obscure format and fully support it. Additionally, FLAC support has been incorporated into Android's core system, so the native media player can use FLAC files transparently. ON TOP OF THAT, if you're adamant about using ALAC, there are applications in the market that WILL play ALAC files.

My second point is file size. I don't care what compressor you use: if you're doing lossless encoding, it's going to take up a lot of space. Additionally, the sound renderer (aka sound card) on your portable device will NEVER be able to render all the detail in a lossless file, and even if it could, between the quality lost in the connectors, headphones and integrated amplifier, plus crosstalk and static, you might as well have saved the time and space and not used lossless.
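
Some quick math on the size point: CD audio is 44,100 samples/sec x 16 bits x 2 channels, which works out to roughly 1,411 kbps raw. Lossless compression typically only brings that down to somewhere in the 700-900 kbps range, so a lossless album still weighs several times what a 256 kbps AAC copy does, for detail the phone's output stage can't reproduce anyway.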

Which brings me to my next point: "lossy" audio codecs.

MP3, or MPEG-1 Audio Layer 3, is a fantastic format, widely adopted by almost everyone and everything for every purpose you could imagine sound being used for. It renders a fair-quality representation of the original media while maintaining low processing overhead and a reasonable size for the quality rendered.

That being said, it's ancient. MP3 dates back to the early '90s, and in computer terms, that's like King Tut.

Other, better, and more versatile formats have been created. The movement started in 2000 with Ogg Vorbis, a format I still love. It's an entirely open-source format (similar to FLAC in concept, just lossy), but it never caught on; the open-source community couldn't compete with the thousands of "MP3" players on the market that simply did not support Ogg/Vorbis.

A few years later, MPEG teamed up with ISO and IEC to create AAC, or Advanced Audio Coding, rendering higher-quality audio at the same bit-rates as MP3. In a direct comparison, there is no downside to AAC over MP3. What they needed now was a way to get it into our hearts, minds and "MP3" players.

Being the next major format to be blessed by MPEG, getting it implemented wasn't entirely difficult. It was slow at first, and the success of AAC relies heavily on its sister codec for video, AVC (Advanced Video Coding, aka H.264, the video behind m4v files). Combining AVC and AAC produced the ever-familiar mp4 files, which are multimedia containers. Almost every modern cellphone will natively record (and decode) mp4.

And the hard work was done. With cellphones quickly replacing both digital cameras and MP3 players, plus having built-in, usually dedicated (on chip) decoding of AAC and AVC, the MPEG and partners have paved the way for wide-scale implementation of .m4a files.

... now if we could only get people to start USING them. ha.

But honestly, in the race for audio codecs, you're not going to find something better than AAC by any significant margin anytime soon... and definitely not something you can decode on the fly using an integrated, dedicated decoder chip (which significantly reduces power consumption, and therefore increases battery life).

If you're really determined to use ALAC or FLAC on your Android-powered device, you're more than welcome to burn the battery and do it, but you probably wouldn't notice much, if any, difference over the m4a version of the file... except of course, in your battery life ;)

To note, I believe most AAC decoder chips will also handle MP3, so fear not, using your trusty old MP3s won't drain your battery dry either... they just might not sound as good as their m4a equivalents.
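
If you do want m4a copies of an existing library, the conversion itself is trivial with something like ffmpeg (again, just a sketch; encoder names and flags differ a bit between builds):

ffmpeg -i song.flac -acodec aac -ab 256k song.m4a

Batch that over a folder and you've got a phone-friendly copy of your collection without touching the lossless originals.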

Tuesday, June 29, 2010

How the internet works

When you understand all the nuances of technology, from how a computer encodes a single bit onto a wire for transmission, all the way up to the protocols everyone knows about, like HTTP and HTML, it's amazing that we don't have more connection issues.

I was seriously deliberating this in my head a few days ago, and an issue I dealt with today reminded me of all those things, and of how the simplest thing can bring the whole system down.

The issue I was trying to resolve for a customer was that a particular webpage refused to load... The cause, as I had determined it, was that an administrator for that webpage had disallowed access from that particular address for whatever reason.

Going through the steps, though, I checked through all the different layers and made sure the connection was okay, that DNS resolution was working correctly, and that the IP settings were correct.

Think about it: when we type google.com into a browser address bar, the browser instructs the system to establish a connection. The system, knowing it has to resolve the name, examines its DNS settings; if the DNS server isn't on a directly connected route in the routing table, it has to go through the default gateway on the default route, which means it first needs to ARP the default gateway for its MAC address.

After resolving the MAC address of the default gateway, it has to assemble a UDP DNS lookup request: framed with the default gateway's MAC address, addressed to the DNS server's IP, and containing the query for google.com.

After it receives a response, it then checks the resulting IP for google.com against the local routing table, determines whether the SYN needs to go out the same adapter to the same default gateway, and initiates the TCP session.

All this in approximately 1/10th of a second or less.
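Just for fun, here's a rough Python sketch of the visible part of that dance (the name lookup plus the TCP handshake), timed; google.com is just the example from above, and the timing obviously depends on your connection:

import socket
import time

host = "google.com"  # the example hostname from above

start = time.time()
ip = socket.gethostbyname(host)                        # the UDP DNS lookup, via your resolver settings
conn = socket.create_connection((ip, 80), timeout=5)   # the TCP three-way handshake (SYN, SYN-ACK, ACK)
elapsed_ms = (time.time() - start) * 1000
conn.close()

print(f"resolved {host} to {ip} and connected in {elapsed_ms:.1f} ms")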

That's a lot to do in so little time... but we do it so often, without thinking, that the significance of it all gets lost. The process, and any appreciation of it, is definitely lost on end users.

Oh well, I appreciate my DHCP and DNS servers, as well as my local default gateway for everything they do for me, each and every nanosecond.

Saturday, June 26, 2010

Radio

It's obvious that the radio we all knew from childhood, AM and FM, is on the way out. If you're looking around at all, you know that. Between satellite radio and internet radio, broadcast radio is being crushed out of the market.

Internet and satellite split a lot of market share between them. Many people are buying cars with built-in satellite radio systems, which can easily be ignored by anyone competent who would rather use some kind of cellphone-based internet radio. With some service providers offering unlimited data on certain cellphone plans, internet radio in the car is becoming more viable.

However, a lot of people have, or will get, a car with satellite radio integrated. Between that integration and the complexity of actually getting internet radio into your car, most people will probably just pay for the service and forget about it. Which brings me to my next point... satellite radio is essentially one company now: Sirius and XM have merged into Sirius XM.

Sirius/XM not only offers satellite, which you can listen to on the receiver built into your car, but also internet radio you can listen to on any computer, plus iPhone, BlackBerry and Android applications. Those mobile app markets are really the most significant place for any company to go when distributing a new application that would compete with Sirius/XM. Prime examples of that competition are services like Pandora Internet Radio and Grooveshark. I'm not sure what Pandora charges for its mobile application, since it hasn't been available where I live for a while, but with Grooveshark there's a subscription fee, for Grooveshark VIP status, that allows the use of the mobile app.

I dunno about you, but if I have to choose between listening to internet radio all the time (in the car, on the computer, on the go, etc.), or listening to satellite when I'm in the car (or on the go, depending on the receiver) and internet radio everywhere else, for fairly comparable prices... I'm in favor of satellite.

I know that with my satellite receiver I can buy a home kit that hooks up to my stereo at home; it also has a battery, so I can hook up some headphones and listen on the bus, on a street corner, wherever. When I'm at home, or away from my unit, I can always load up the Android app and listen wherever I happen to be (provided I get a 3G or wifi signal)... and if I happen to be somewhere strange, like a friend's house, and want to listen to some tunes on my Sirius/XM subscription, I can log in on the website and listen to all my favorite channels over the internet.

While some setups, like Pandora, can practically generate your own radio stations based on what YOU like, many people don't care for that level of customization, or they just don't have the time to tweak the settings until the stations are just so. Even then, they're still using an internet radio service over their 3G data plan... so there's that.

Overall, I believe there will always be a place for dedicated internet-only radio... I just don't think it's in the car. That's why it'll be difficult to get rid of broadcast radio, and even more difficult to overtake satellite.

Thursday, June 24, 2010

HTML and Web 2.fail

For the most part, the concepts behind Web 2.0 and other HTML- and webpage-driven ideas are excellent, but lost on the average user.

I remember back in the golden age of the .com bubble, when GeoCities was hot and everyone had to have a page of their own creation. The biggest problem was that 90% of users were running a never-customized Windows XP with Internet Explorer 6. This was the root of all problems.

First, a user would create a webpage (going beyond their means, I might add), and they'd show it to a friend... To clarify, an unconfigured Windows XP install defaults to a screen resolution of 1024x768, which, at the time, was huge for most users (who were previously used to the Windows 9x default of 640x480... or something).
Add to that the fact that they were using one of the worst browsers in the history of the internet, and you have a recipe for failure.

The page did look fine, though, to them and to all their friends (also using 1024-wide displays and IE6). But then a Mac user would come along, or someone with a differently configured display would take a look, and they'd wonder what the heck was wrong with your site.

First of all, the sites were built by users who didn't know how to correctly create the effects you see on almost every webpage (e.g. tables, graphics, etc.), nor did they understand when it was appropriate to use those techniques and when it wasn't. Even something as basic as an unordered list was beyond the scope of the average user's understanding.

Then add graphics, and all the various graphics editors, like Paint. Someone would create a background in Paint and upload the BMP file to GeoCities, and it would look fine to them, while the rest of us wouldn't see it at all (because it was still loading)... Even when it did load, a viewer with a differently configured display would probably see large white bars surrounding the image that the person who created it had no idea were there.

Nowadays this doesn't generally cause problems. The reason I'm reviewing all this old failure is that recently, someone I follow regularly, purely out of interest in what they're doing, set up a Twitter account with customized graphics that I couldn't make sense of... until I resized my browser.

Their background image is set up as a fixed image, so that as you scroll it stays in the same place on the screen, which is great, right? However, the image doesn't repeat, nor is it designed so it could repeat; there's strategically placed text in both margins, and there's a white bar of nothingness just beyond the 1280-pixel line. The creator has a 13" MacBook with a native resolution of 1280x800, so they can't see any of this, but on my screen the text meant for the right margin sits behind the Twitter feed, and the right margin is just a large white vertical stripe.

I'm not trying to put down or speak ill of this person; obviously they simply don't know, nor do they have an easy way to check these things... but it reminds me of the days of old, when people would haphazardly put things up, not realizing that on any system configured slightly differently, it not only doesn't look the same, it looks downright terrible.

To the web designers out there: don't forget to check your page at several different resolutions for consistency, and check it in multiple browsers (IE, Firefox, Safari, Chrome, Opera, etc).

Wednesday, June 23, 2010

Network Speeds - GBe, Wireless N and how they affect you.

A significant debate has been running in my mind between different wireless (and wired) network technologies, relating to effective speed.

What I mean by effective speed is two things: first, the speed you can literally get out of the network (after overhead, crosstalk, and other factors); second, the speed that's actually useful to the end user.

Because of this, I end up in quite a conundrum... With server and back-end topologies and technologies, you generally know what kind of speed you'll need and what you can use. When connecting servers together, whether from scratch or into an existing network, you can work out whether you need GbE, 802.3ad-bonded GbE, a multi-gigabit connection, or even just a 100Mbit connection, depending on the application. For example, a high-performance file server or database is something you may want to put on one of the more expensive connections, especially if the system will be used concurrently by many users and the drive array can handle multiple gigabits of sustained, simultaneous output to multiple destinations...

For servers, deducing what you need is the easy part; the hard part is not only finding the hardware (since 90% of computer shops carry consumer-oriented products only), but getting management to sign off on the purchase...

For client access roles and access points, you really have to start asking: is one technology really better than another? Let's review.

Almost all network access by end users (or consumers) is internet-bound. Not many people live in a world where an intranet even exists, never mind having servers set up on it or accessing any "local" resources. With this in mind, I quickly consider two things: how many people will be using the service, and what is the WAN speed?

WAN speed: most consumers are on consumer-grade internet lines, which are generally not terribly fast. In North America, most consumer broadband lines fall between 3Mbit and 15Mbit. There are exceptions at both extremes, but for the most part they fit this model. In these cases I have to question the value of buying the latest GbE router or switch, or the newest, fanciest dual-band Wireless N router or AP. Since 90% of traffic is going to be internet-bound, the fastest any one user's connection will go is 3-15Mbit. The current standard for wired networking is 100Mbit full duplex (100BaseTX), and the current standard for wireless is 802.11g (Wireless G) at 54Mbit. Both offer connection speeds several times FASTER than the internet line behind them.

Factor in that consumer internet lines don't seem to be getting any significant bump in speed, now or in the near future, and you've found yourself in my debate.

If you're not using any resources on your local network, why do you need anything more than a 100BaseTX or 802.11g network? ... To be fair, wireless technologies will never run as fast as advertised, because send and receive happen on the same frequency, making the link half-duplex by nature (you can send OR receive, not both at once). Still, even in high-traffic situations, a half-duplex link can usually sustain something near 30-40% of its maximum bandwidth (except in extreme scenarios).
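Putting rough numbers to that (the 35% figure is just my ballpark from the 30-40% above, not a measurement):

# Back-of-the-napkin: effective Wireless G throughput vs. a typical consumer WAN line.
advertised_g = 54            # Mbit/s, the 802.11g signalling rate
effective_fraction = 0.35    # my ballpark 30-40% figure for a busy half-duplex link
wan_speeds = [3, 15]         # Mbit/s, typical North American consumer broadband

effective_g = advertised_g * effective_fraction
print(f"effective Wireless G throughput: ~{effective_g:.0f} Mbit/s")
for wan in wan_speeds:
    print(f"  vs a {wan} Mbit/s internet line: {effective_g / wan:.1f}x the WAN speed")

Even a battered 802.11g link comes out ahead of the internet connection it's feeding.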

Additionally, a lot of the technology touted as "Wireless N" is really just a beefed-up Wireless G that's been given encoding similar to Wireless N (making it possible to encode more data into each transmitted wave, and therefore increasing throughput)... What I mean is that 802.11n was originally designed to run on higher frequencies, with shorter wavelengths (they eventually settled on the 5.8GHz band). With shorter wavelengths and better encoding, it became possible to pack significantly more data into the stream than was previously possible.

Allowing Wireless N on the same frequencies as Wireless G causes additional interference: Wireless G takes more time to transmit and creates more noise on the channels Wireless N is trying to use, while Wireless N is just unintelligible noise to any nearby Wireless G gear. The real conundrum is that to use Wireless N on 2.4GHz effectively, you have to bump the channel width from 20MHz to 40MHz. Run a "fat channel" (40MHz) on channel 6 (the midpoint for North American wireless), and the radio spills over into almost every other 2.4GHz channel, causing interference across the whole band.
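To put some numbers on how much of the band a fat channel eats, here's a rough sketch (the channel centres are the standard 2.4GHz values; the overlap test is just arithmetic):

# How much of the 2.4GHz band does a 40MHz "fat channel" on channel 6 touch?
def centre_mhz(channel):
    return 2412 + 5 * (channel - 1)   # channel 1 is centred at 2412MHz, channels step by 5MHz

fat_lo = centre_mhz(6) - 20           # a 40MHz channel on channel 6 spans 2417-2457MHz
fat_hi = centre_mhz(6) + 20

for ch in range(1, 12):               # North American channels 1-11
    lo, hi = centre_mhz(ch) - 10, centre_mhz(ch) + 10   # a normal 20MHz channel
    overlaps = lo < fat_hi and hi > fat_lo
    print(f"channel {ch:2d} ({lo}-{hi}MHz) overlaps the fat channel: {overlaps}")

Run that and every single channel from 1 to 11 comes back as overlapping.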

The bottom line with 2.4GHz Wireless N is that it only really works in a controlled environment, where there are nearly no other 2.4GHz networks or devices (and that includes cordless phones).

Add to that the fact that the extra speed isn't making anything go faster, because you're using that 150-300Mbit of 2.4GHz Wireless N just to access the internet, and you end up with a mish-mash of different, competing technologies that completely ruin the experience for everyone (since they cause so much interference).

The only true benefit you could ever get from Wireless N is in its intended implementation at 5.8GHz (where there's currently very little usage, aka interference), using dedicated 5.8GHz-only devices and nodes. Even then, you'd have to be using that wireless to access local resources; not just that, but you'd have to make sure that your AP, and every link between you and the system you're talking to, is GbE, since Wireless N can fully saturate 100Mbit Ethernet... Then, on top of that, you almost have to be hitting an array of drives to really take advantage of the throughput, since even good conventional drives max out around 400-ish Mbit... and that's not even touching how useless GbE is to most users...

Yeah, I understand that, even where the bandwidth isn't really necessary, GbE can reduce ping times because each packet takes less time to put on the wire; in real-world scenarios, though, the difference is negligible at best.

The really baffling thing, for me, is when respectable companies that actually have intranets, with dozens of client systems, roaming profiles, network shares, VoIP, internet, etc., all connected to the same network fiber, are still running plain megabit Ethernet. That's. Just. Amazing. Upgrading to GbE in those scenarios would have a massive impact, and the upgrade costs would be minimal, since a lot of unmanaged switches are rather cheap, even with large port counts... and managed switches aren't too far behind in cost.

And really, in those scenarios, isn't the cost of the switch far outweighed by the increase in worker productivity, now that people don't have to wait forever for a roaming profile to load before they can actually do some work?

Food for thought.

Monday, June 21, 2010

It's been a while.

I know it's been a while since my last update. I've been thoroughly enjoying Android, and there isn't much bad I can say about it.

I've been intensely invested in mechanics lately, so I don't have much productive content to add here. I will say that I've taken a new job that's more technically oriented. Hopefully I can find new and interesting ways to do the same job everyone else is doing, but faster.

I'm hoping that learning the systems involved in the new job, and all the contributing technologies, will let me break the information down into modules and generate some new posts for everyone with some (hopefully) useful information.

There are some points where I think Android can improve; however, they're actively upgrading the operating system, so I don't have a lot of room to complain.

My main complaint is that, as a Motorola Milestone owner, I miss out on HTC's Sense home screen... I don't even have the option of buying it. Keep this in mind when comparing Android phones. Go check out your local cellular stores and compare a non-HTC Android phone, like the Milestone, to an HTC phone, like the Hero, Legend, or Desire. Particularly note the unlock and home screens; also check out the media player and examine the differences.

I'm sure that if I rooted my phone I could probably install a hacked version of Sense, but that's not what I'd prefer to do. All the features I would get from rooting my phone, I either already have or don't really care to have (besides running hacked software, which is probably illegal anyway).

No matter what you do, I recommend you buy protection for your phone. It's become painfully clear to me how many people drop mobile devices, so protecting your phone, with either a hard-shell or soft-shell case, is necessary. I use the hard-shell Otterbox case for my Milestone, but I know many phones have silicone skins that are also good at absorbing shocks. Choose whatever you're most comfortable with, because if the case drives you crazy, you'll eventually remove it, and then it's just a costly lump of waste in a corner somewhere.

I wouldn't normally care too much about software versions; however, some devices still ship with very old versions of Android (e.g. the HTC Hero, at least until recently), where you'll only get Android 1.5 or 1.6. Normally I say whatever works, go with that, but I've had a chance to use an HTC Hero with Android 1.5, and I have to say the changes since then are significant. Do your best to ensure the device you buy runs at least Android 2.0 or 2.1 (Eclair); without at least 2.0, pinch-to-zoom and other significant features will not be available on your device until an upgrade is issued for it.

My last comment on the Milestone is that the network access is incredible. As broadband goes it's meager at best, but for a cellphone the access speeds are remarkable. The Milestone (aka the Droid) uses full HSPA on the network I'm attached to in my area. The speeds are so close to what I expect from the phone on wifi that there's rarely a big enough difference to warrant switching over to wifi at all. The only time I activate the wifi on my cell is to update the apps on my device (where I'll be downloading several megabytes in a short period of time), and even that is merely to conserve 3G usage on my data plan, since my carrier charges per MB and allots only so many MB per month for data access.

Android is fantastic; if you're looking for a new smartphone, it's the way to go. With the thousands of apps in the Market, you're sure to find something to suit your gaming, entertainment, and productivity needs... Additionally, the system is fast, with lots of integrated features; the base OS is so good, in fact, that I tend to use the defaults for many things, since they suit me just fine. And if you're so inclined, you're capable of manipulating almost every facet of the device's functionality.

Sunday, June 13, 2010

The Culture of the Internet

When you speak of culture, you're dealing with varying groups of people from different walks of life who are all bound together by a particular trait they share. Maybe that trait is a love of animals, or the non-eating of animals and their by-products. Or perhaps it's a specific musical genre or way of life.

Whatever it is, these groups have all found their way ONTO the internet. Absolutely none of them come FROM the internet.

I say this because the internet itself has a culture, and I wouldn't want anyone to confuse internet culture with a culture found on the internet. 90% of the cultures, genres, and groups of people online originated beyond the internet and have simply made their way there.

I would say that the culture of the internet itself centres around places where original content (or O.C.) is produced, not from any specific source, but from all sources. That culture's focal point is, in my opinion, the forums found at 4chan. The anonymous are the culture of the internet, and they've created many laughs for all of us, whether we know it or not.

There are, in my opinion, three sources for information about the culture of the internet...

First, you can join it, by going to /b/ and reading what's posted. Unfortunately, posts are rarely up for more than a few hours, so the content is constantly changing. It gives new meaning to the term "here today, gone tomorrow," because everything on /b/ changes every few hours... everything.

Second, if you choose not to join it, there are two notable sources you can get your information from. The first is Rocketboom, a bit better known than your other option; Rocketboom produces a YouTube series called "Know Your Meme," in which they review, and explain, the most notable output of internet culture. Rocketboom will explain everything the internet culture finds funny, while keeping you at a safe distance from the people in it.

The third and final option is to browse the wiki at Encyclopedia Dramatica (which seems to be down at the time of this posting). This pseudo-encyclopedia contains everything you could want to know about every meme, written by the authors of the memes and their contributors. That sometimes makes the information hard to understand, or obscured in some way, since there are a lot of in-jokes layered over what they're actually saying (try looking up "starting your own religion" and you'll see what I mean).

Through any, or all, of these methods, you too can be savvy about whatever the internet is talking about. And if you're truly bored, check out 4cha... wait, rules 1 & 2 of the internet prohibit me from saying any more...

I've already said too much.

Monday, May 31, 2010

OWA Pictures still missing!!

So, after my previous fix, the OWA pictures stopped loading again.

I moved the most recent version folder (and its contents) under the Program Files\Microsoft\Exchange Server\ClientAccess\Owa directory to a safe place, for safekeeping, and issued a command to create a junction with the same name as the directory I had just moved, pointing it at the previous version.

to do this, I used the command line command: mklink /J

or something of the sort.
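From memory it looked roughly like this, run from an elevated command prompt; the version-numbered folder names are just placeholders for whatever versions you actually have (the first path is the junction being created, the second is the older folder it points at):

mklink /J "C:\Program Files\Microsoft\Exchange Server\ClientAccess\Owa\8.1.NEWEST" "C:\Program Files\Microsoft\Exchange Server\ClientAccess\Owa\8.1.PREVIOUS"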

This effectively made the previous version's contents appear under the "current version" directory name, without actually copying anything.

OWA works again.

Of course, I checked to make sure the application settings were correct in IIS Manager first...

Good luck to all those who may be experiencing this problem.

Saturday, May 22, 2010

Google

With a new phone powered by an operating system made by Google, and having used Google for just about everything, from directions, to mail, to organizing my day, to talking to friends... I just wanted to put out a thank you.

Thanks Google.

For winning over our hearts and minds by being more than just a good search engine.

Friday, May 21, 2010

Exchange OWA pictures missing!

After a recent update, automatically applied to my server as of the last time it was active (and likely incomplete), my OWA's pictures stopped working.

I finally resolved the problem.

You can go through and debug every little error if you'd like, and I recommend it (a smooth OWA is a happy OWA), but what I found the problem to be was missing pictures.

The confusing part about OWA and Exchange's web presence (Client Access), is that it's all so intricate that it's hard to decode. Take a look at your %programfiles% folder, under Microsoft\Exchange Server\ClientAccess\Owa\

You'll see some base files and a series of folders, mine all started with 8.1.###.#

I have about five different versions of the OWA server installed; at least, that's what I'm gathering from the different version-numbered folders. I would presume that the ASPX files in the root dynamically load the newest OWA revision from those folders.

An easy and quick fix would be to revert to an old OWA: just delete the newest (highest-numbered) folder and, voila, it should work... My problem was that I was missing the pictures from OWA and OWA Light.

I still haven't gotten OWA Premium working; I suspect it might be a problem with IE on this system, and I have to do more investigating before I figure it out. Regardless, to fix a problem like the missing pictures, go into the second-newest folder (where, presumably, OWA was working as expected), copy the themes folder, and paste it into your newest OWA folder, merging it with the existing contents. There were CSS files in my existing folder, which I chose to keep, but I would assume the damage would be negligible if you decided to "overwrite all" in this scenario.

If you've done this, or confirmed the folders already have GIFs and JPGs (or whatever) in them, then you're dealing with a permissions issue. Be sure to give the IIS user account (IUSR_<machinename> for IIS6, IUSR for IIS7) read access to the themes folder, recursively. That should fix the problem of no pictures loading, and should at least get OWA Light working.
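If you'd rather grant that from the command line than from the folder's security tab, something along these lines should work on IIS7 / Server 2008 (the path is a placeholder for your newest version folder; icacls is the built-in permissions tool):

icacls "C:\Program Files\Microsoft\Exchange Server\ClientAccess\Owa\8.1.NEWEST\themes" /grant IUSR:(OI)(CI)R /T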

Getting a straightforward answer on all this has been difficult, but when I noticed that the subfolders under themes in the newest owa directory only contained one CSS file each, while the older version-numbered folders had many more files of varying types (mostly CSS, GIF, etc.), I decided to give it a go; and it worked!

Now, if I could only find that guide I had that explained how to use self-signed server certificates for Exchange ActiveSync...

I know it has to be done from the command line, since the GUI doesn't have the options needed to create a self-signed cert for a DNS name the server does not technically inhabit. To explain: my server sits behind a NAT firewall, with port forwarding for port 443 (SSL). The server's address, according to the server, is WIN-*random characters*.domain, while the rest of the internet (and any web-accessing Windows Mobile device) sees it simply as the domain name (no computer name). So the server needs to issue a self-signed cert for the root domain, with the public key exportable to the device (for manual addition, re: trust).

I'll do the work on it later, when I have time; for now, OWA works, despite the certificate errors.

Tuesday, May 18, 2010

(sometimes not-so-) Smartphones

I started my journey with smartphones about a year back. I was an avid Windows Mobile user, and I started with a very simple philosophy, based on my past experience with Windows Mobile as a PDA platform and with the Dell Axim x51v, which was built by HTC.

So my philosophy was, combine HTC, with Windows Mobile (preferably something later than version 6), and you should come out a winner. I was wrong.

I ended up buying the HTC Touch Pro from Telus Mobility, here in Canada. We don't have very good options for mobile networks here, and I don't want to get into the nuances of why Telus is superior to the other two of the big three (Bell, Telus, and Rogers). Needless to say, it took me a long time to arrive at Telus' door. Now that I'm here, I'm not leaving.

At first, I was really satisfied with the device. I even set up an Exchange server for the purpose of syncing my contacts, calendar, etc. I didn't have a proper DNS or MX setup at the time (nor did I know how to make one), so the system wouldn't RECEIVE any email, but it worked for what I needed. I had to reset my phone a few times during the past year, and the Exchange server was instrumental in getting up and running quickly.

There are two major quirks with this phone, from this provider, that I can point to as the primary reasons for wanting to rid myself of it.

First, and primarily, is text messaging. Don't get me wrong, texting on the Touch Pro was a dream; however, the phone would wake up EVERY TIME a text was received. That means the touch screen is active, and a lot can go wrong when you don't hear the alert tone and the phone is rattling around in your pocket for a few minutes. I had one instance where I "pocket dialed" someone more than half a dozen times (the same person!)...

I searched without luck to try and figure out how to fix the problem and eventually just lived with it until I got so fed up that I replaced the phone.

I know there are a lot of people who may yell and scream and jump up and down saying there are apps for that, and ways to lock out the screen so that even if it comes on, it won't end up pocket dialing people. To them, I would say that I have yet to find a solution that isn't completely and totally irritating every time you turn on your phone, while still maintaining the overall performance of the phone. There's a built-in screen lock, which requires a code to unlock; not convenient if you just want a quick look at who sent you a text or email. There are also 3rd-party applications like S2U2, which I actually tried for a time, but the latency it added to the phone cost more than the convenience it offered.

Second, there was the phone itself. As a multimedia, email, and texting device, as well as an organizer for contacts and planning my day, the HTC Touch Pro was excellent in almost every regard, some areas a little less so, but overall, excellent. As a phone, however, it just wasn't very good at all. Don't get me wrong, the overall experience of talking on the phone was adequate; you could clearly understand the person on the other end, and they could hear you.

That's not what I'm talking about.

After about 10-15 seconds on a call, the display would turn off, and the only way to re-activate it is to hit the power button... With automated systems that have greetings longer than "hello", by the time you get the opportunity to make a selection, enter your party's extension, or enter your password (in the case of voicemail), the display is off. A minor gripe, I'll admit, but still, every time. There's probably a way to change this, but by then I was so tired of fighting with the phone to get it to do what I wanted that I just decided to live with it.

After that, the keypad wasn't always responsive. You could hit three or four keys before it responded, but it would remember each button pressed on-screen, so you could, for the most part, punch in an extension even while it was lagging, and it would eventually send the tones. Unfortunately, they register as very short taps, so the tones are also very short, which some systems don't recognize; this resulted in having to re-enter the same sequence, more slowly.

Additionally, the phone application, when ending a call, would completely lock up the phone for 10-15 seconds for no discernible reason. The system would then spring back to life (all your taps recognized at once), and you'd be back in business. Again, a minor gripe for sure, but should I really have to wait 15 seconds to suspend my phone after every call? That's wasted time, and often I'm only waiting for it to respond long enough to exit the phone application and shut off the phone so I can go and do something else.

Yesterday, I officially took the plunge (into financial turmoil) and bought a new smart(er) phone. Since I'm always eager to try something new, I decided to give an Android phone a go. I can, and have, used almost every other mobile operating system, and here's my general rundown:

Standard phones: with most "regular" cellphones, an OS is an OS. The presentation and layout may vary, but the content is pretty much set. If you've seen one T9/"predictive text" input, you've generally seen them all. Buttons vary, e.g. which button is the space bar and which one selects the next candidate word, but beyond that, there isn't much variation. Even with that said, I prefer the OS you'll find on Samsung phones; it's fairly clean and responsive, without a lot of extras, which I like. I'm also a personal fan of Samsung. There isn't much I'm not a fan of, other than Sony.

Apple and Apple-related products: I have a relative with an iPod Touch, as well as several friends with the same affliction. I'm sure I know others with iPhones, but honestly, from what I've seen, other than the ability to call someone and text, and the freedom to use a 3G network instead of being confined to wifi range, there's not a lot of difference between the two OSes. They're neat and organized, with little variation. I don't like the interface myself, and with the lack of multitasking on the phone, I would rather not own one. Granted, on a small-format device like a cellphone you're probably not going to be jumping between apps like you would on a desktop, but still: knowing my MSN or Gtalk is running in the background, collecting messages so that I'm available to friends and family for a large percentage of my day, is important to me, and I can't see that being feasible on one of these devices.

They're nice, I'll give them that. They have good hardware, and Apple knows what they're doing when it comes to hardware. But the support is lacking.

BlackBerries: you've seen one, you've seen them all. I guess that's good, in a way. Other than the fact that some don't have keyboards and do it all on-screen, there isn't much variation here. A friend of mine bought a BlackBerry a week or so ago, and I finally got a chance to play with it the other day. I found myself thoroughly bored with the interface, almost searching for something to do on the phone. These are, very much, work devices. If you're not emailing, sending texts, talking on the phone, or using it for something else along those lines, you're probably not using it at all. Don't get me wrong, a BlackBerry has its place, but if I wanted an overpriced, souped-up Nokia 5190, I'd buy one.

Which brings me to my next point,

Symbian: I'm not sure why this deserves its own category, but I'll roll with it. Symbian is a beefed-up version of the same recycled operating system that's been used on Nokia phones since they could program one in. I first experienced the early, almost "beta" version of Symbian on the 5190, again on the 6190, and again on the 7210. When a friend of mine literally destroyed his 7190 out of pure frustration, I decided to move in a different direction. I haven't kept up with the development of this operating system; I don't think many have. To be honest, it's almost sad to see how far Nokia has fallen. Their phones are usually of fairly good build quality, though, so if you're not in the smartphone market, Nokias are usually solid devices.

As a smartphone, though, I wouldn't choose a Nokia over a BlackBerry. Just wouldn't happen.

Saving the best for last, Android.

Google keeps outpacing itself, and I hope that keeps up. Every time I see some new, supposedly amazing Google application or device, I remain skeptical, even if I sometimes get in on a beta (e.g. Google Wave). I usually end up pleasantly surprised, but I usually keep away from Google's products until they've proven themselves in the market. I moved to Google Chrome as my primary browser and have slowly adopted most of Google's online offerings through a multitude of Gmail accounts. There are some products they've come up with that are currently US-only which I would be eager to try out, despite my skepticism (e.g. Google Voice).

With all that being said, Android is a truly unique platform, in the sense that it supports, and almost promotes, free software distribution, while still allowing proprietary, 3rd-party applications to be installed as well. As an operating system and as a standard, it's a truly beautiful product, regardless of the hardware.

Hardware helps, though, and there's a lot to choose from among Google's offerings. They've created an almost invincible alliance with HTC, who until recently seemed lost in a haze of making products for other people (e.g. Dell Axims). Now, with HTC at their side, Google seems to be blazing forward with new momentum, taking over a vastly expanding market: cellphones. They aren't without competition, and I'd be lying if I said I knew a lot of people with Android phones. But interest remains high, and improvements to existing devices, plus new ROM releases for the devices that can support the new features, are fairly common.

After looking over the options, I decided to get a Motorola Milestone from Telus. I chose this option for 3 reasons.

First, I decided on the Milestone because of the keyboard. Being a long-time cellphone keyboard lover (starting with, I think, a Nokia 6865 from Rogers), I've long loved having a keyboard for texting on my cellphone. With the addition of MSN and Gtalk, as well as email and web browsing, not to mention the social networking applications, a keyboard makes sense.

Next, why the version from Telus? The price of a Milestone from expansys.ca is actually LESS than the base cost of the Milestone handset from Telus. However, Telus was offering me a discount on the phone for renewing my contract and agreeing to re-start my 3 years with them... I'm only a year into my contract, and it looks like the Touch Pro won't make it 2 more, so I'm okay with that.

Finally, the Milestone is advertised as coming with an 8GB MicroSDHC card, for all the various applications and content you'll want to put on your device. This phone is a multimedia powerhouse with a high-resolution FWVGA display, so 8GB is more like a minimum than anything. That said, Telus was offering the Milestone with an included 16GB MicroSDHC card. I later discovered this is a Class 2 card, but the performance of my specific card seems to be more around Class 4-6, so I've been fortunate.

Overall, the purchase cost me less, and I also ended up grabbing a power adapter for my car and an Otterbox case for it. I still have yet to get a holster-style case, which I've done with every phone I can remember (since the 7210), but that too will come with time.

Initial impressions are good. There's lots of free software in the Android Market, and I'm pleased overall with the performance of 2.1. I'm somewhat disappointed with the video format compatibility: even with x264-encoded video, an MKV file still won't play natively on the phone. The x264-encoded stream should be directly compatible with the H.264 decoder built into the phone, but the media player will not demux the MKV container.
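In theory the workaround is just to re-wrap the streams into an mp4 container without re-encoding; if you have ffmpeg on your computer, something like this should do it (filenames are placeholders, and it assumes the audio track is something the phone can already decode):

ffmpeg -i input.mkv -vcodec copy -acodec copy output.mp4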

I have a lot of testing and discovery left to do with this phone, and I hope it's a pleasant experience overall. So far, so good. And there's a built-in slide-unlocker... and the phone stays "off" when getting a text! woo. So far, all my gripes about the Touch Pro are gone; I wonder what other gripes I'll find.

If you're curious and want me to check something, just leave a comment, I'll be happy to try something out on the phone and get back to you. I'll be happy to answer any other questions that may be posted too :)

Saturday, May 15, 2010

Wifi

I know my blog doesn't reach very far yet, and most of the people who have seen it, and will be reading it currently, don't have extensive technical knowledge, so with that in mind, I move forward.

My posts will become more technically driven, worded, and oriented as time goes on, but if you find that I'm explaining simple technical concepts here, that's because I expect most of my readers to be non-technical people.

I just saw a video on YouTube about a man who boosted his wifi signal strength for less than $1, using an old box and some aluminum foil. I'd like to throw my support behind the guy, and he's got the concept right, but the implementation is clearly done by someone without technical knowledge of how wireless functions.

For the purposes of this article, we'll be talking about the most common wifi: 802.11g, 2.4GHz wireless.

Here's a brief rundown: in optimal conditions, a wireless router drives an alternating current through a wire at 2.4 billion cycles per second, which produces an electromagnetic wave that travels through space toward its intended target. This wave travels at light speed in a vacuum, but since we don't live in one, it's slowed down slightly by air and whatnot. Regardless, from the frequency (2.4GHz) and the distance an electromagnetic wave covers in one second, we can determine the wavelength: the distance the wave travels during one cycle (up, down, and back to centre). Do the math, and a 2.4GHz wave has a wavelength of approximately 12.5cm (or about 4.9 inches). All this aside, the 802.11 protocol was designed with reflections and noise accounted for, so your computer isn't expecting a clean wifi signal to begin with. With this guy's reflector sitting... what, 2cm from his antennas, the reflection it creates isn't quite in phase with the direct signal.

Briefly, phase describes where in the wave cycle you are. It's measured in degrees: starting at zero degrees at the centre line, rising to the highest point at 90 degrees, returning to the centre line at 180 degrees, hitting the lowest point at 270 degrees, then returning to centre again at 360 (or zero) degrees.

The wifi card can cope with the not-quite-in-phase reflection and interprets it along with the rest of the data on the channel, but it makes for an uneven waveform (which hurts the potential signal quality).

The intention is to get the direct wave and the reflected wave arriving in phase, so the two complement and amplify each other. The catch is that a wave flips phase (shifts 180 degrees) when it bounces off a conductor like foil, so for the two to line up at the target, a flat reflector should sit about one quarter of a wavelength behind the antennas, on the side opposite the desired target.

All this technobabble means that if you want the best boost from a flat reflector at 2.4GHz, it should sit roughly 3.1cm behind your antennas. At around 2cm, this guy is a little too close, so his direct and reflected waves are somewhat out of sync; the reflector will still help, just not as much as it could. The spacing to really avoid is out around half a wavelength (about 6.25cm), where the two waves start cancelling each other and you'd actually hurt your wireless performance.
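If you want to sanity-check those spacing numbers, the arithmetic is just the speed of light divided by the frequency (free-space values; real-world numbers will differ a little):

# Quick sanity check on the spacing numbers (free-space values).
c = 299_792_458     # speed of light, m/s
f = 2.4e9           # 2.4GHz

wavelength_cm = c / f * 100
print(f"full wavelength:    {wavelength_cm:.2f} cm")       # ~12.5 cm
print(f"half wavelength:    {wavelength_cm / 2:.2f} cm")   # ~6.25 cm
print(f"quarter wavelength: {wavelength_cm / 4:.2f} cm")   # ~3.12 cm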

Optimally, the reflector should be parabolic, curving around the antennas so that all the reflected energy heads off toward the target together.

I hope this explains some of the nuances of wifi technology and signal-boosting tricks. His idea is good, and it's a very simple, effective way to increase signal strength (if done right).

I hope you all enjoyed, good luck if you're trying this, and happy wifi-ing.