It is an interesting question, and many of you are wondering what pipe I've partaken of in asking it. How can broadband save the environment and money? I ran across an older real-estate article on the demand for broadband, broken down by category of use. And I mean an old one - 2002, I believe. In it was an assertion that 17% of broadband demand was for telework - working from home. See, there are the light bulbs.
Let us just run a few quick, back-of-the-envelope calculations. Assume that 15 of that 17% could and would work from home at least 75% of the time. Strictly, 15% of commuters off the road 75% of the time works out to about an 11% cut; to stay conservative, call it 7.5%. Using March 2005's daily gasoline usage in the US (yes, I know about Diesel, but those are fewer; yes, I also know about E85 - much more so than many reading this!) of about 320,500,000 gal/day, we could use 24,037,500 fewer gallons per day. Figuring each barrel (bbl) of crude yields roughly 20 gallons of gasoline, that would be a direct reduction of about 1.2 million bbl/day (again, all general BOTE). We import about 10M bbl/day. Not too shabby.
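For those who want to check my envelope, here is the same arithmetic as a quick Python sketch (the inputs are just the rough figures above, nothing more authoritative):

    # Back-of-the-envelope telework fuel math. All inputs are the rough
    # 2005 figures quoted in the text, not authoritative data.
    US_GASOLINE_GAL_PER_DAY = 320_500_000  # approx. March 2005 usage
    TELEWORK_CUT = 0.075                   # conservative 7.5% reduction
    GAL_GASOLINE_PER_BBL = 20              # rough gasoline yield per barrel of crude

    gallons_saved = US_GASOLINE_GAL_PER_DAY * TELEWORK_CUT
    barrels_saved = gallons_saved / GAL_GASOLINE_PER_BBL
    print(f"{gallons_saved:,.0f} gal/day")  # 24,037,500
    print(f"{barrels_saved:,.0f} bbl/day")  # ~1.2 million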
To put that into perspective, that is roughly half of our Persian Gulf imports.[1]
And this could be done without a single change to fuel economy standards, vehicles, or increased gasoline taxes. But the effects don't end there. If this 7.5% reduction were to occur, we would save more than that 7.5% usage figure I just used.
Here is where it saves local governments money. I'll try to run through this and we can revisit it later. Fewer cars on the road leads to less traffic congestion. Less traffic congestion means lower "demand" for more and larger roadways. It also means less stop and go traffic. Less stop and go traffic means less time spent idling (wasting fuel), less time starting from a dead stop (relatively wasting fuel compared to a smooth flow), and less time on the road running an engine.
All of these result in better fuel efficiency in today's vehicles. Even yesterday's vehicles benefit from this - and do so without costly retrofitting. How much is difficult to say, so I'll draw on some anecdotal evidence from my personal experience. Yes, I've compared the economy I get in light traffic to that in heavy traffic. I've done this in the Vette and the Suburban. The Vette shows a rough average difference of about 10-15%. That is not small potatoes, folks. The Suburban shows a difference of roughly the same, tending toward 20%.[2]
So let us figure an additional overall drop of, say, 5%, for a total of a 12.5% drop in gasoline usage. That would bring us to roughly 2M bbl/day less, or over three quarters of our Persian Gulf imports. Is it enough to end any "crisis" mode? Absolutely not. But is it enough to make a significant difference? Absolutely.
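Extending the sketch above to fold in the congestion bonus (again, the 5% is my own rough guess, nothing more):

    # Fold the assumed 5-point congestion bonus into the earlier numbers.
    US_GASOLINE_GAL_PER_DAY = 320_500_000
    GAL_GASOLINE_PER_BBL = 20
    TOTAL_CUT = 0.125  # 7.5% telework + 5% congestion bonus (assumed)

    barrels_saved = US_GASOLINE_GAL_PER_DAY * TOTAL_CUT / GAL_GASOLINE_PER_BBL
    print(f"{barrels_saved:,.0f} bbl/day")  # ~2 million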
If the assertions about US "Defense Spending" being heavily influenced by and a form of subsidy to the oil industry are true, then there would be the opportunity for a lower defense budget. But that doesn't really help local governments out, does it? Still, I would not object to lowering operations abroad and subsequently lowering the military budget.
Local governments are assisted by widespread broadband through the fewer-cars-on-the-road == less-traffic-congestion aspect. With fewer cars on the road during peak hours, the existing infrastructure would fare much better. This means reduced impact and reduced "need" for widening the roads, "traffic" cameras, etc. Not an end-to-end solution by any means. But then again, you don't see me pitching this to get mass funding either - an act so typical of the proclaimed 'solutions' to these matters.
But are there enough jobs of the kind that people could "work from home" a few days or more per week for this to "work"? I believe there are more than enough. You might be surprised at the number of jobs that could be done from home if so desired/allowed.
Tech support from home? Been there, done that. Heck, if we can ship it to other countries, why not ship it into the home instead? For well over a decade we've had the phone technology to do this. Indeed, from what I've heard, many of the infamous 1-900 phone system operators worked this way. Broadband could take this and extend it. Picture the support person's side of the call being done over Voice over IP (VoIP) such as Skype, Vonage, etc. You the customer place the call, the call is routed from your phone to the support company's network and shipped over broadband to the support person's VoIP phone.
By combining this with VPN technology (Virtual Private Network: it allows your laptop at home to be "on" the company network), tech support or even non-technical customer service support could be distributed. The technology for this has been around for quite a while. This application of it could reduce the size, scope, and city service infrastructure requirements for call centers.
Perhaps instead of an 8,000 square foot facility, a Dell or HP could use a 2,000 square foot facility and have their support representatives work from home, say, 60% of the time. Then with good scheduling they would have fewer people on site at any given time (training, meetings, supervision of critical accounts, etc.), thus "fitting" into the smaller facility.
With this scenario, HP (for example) would find it easier to get approval from the city/county. The city (residents as well as government officials) would have far fewer concerns about the environmental and infrastructure impact of the facility. HP would enjoy lower costs that they could pass on in benefits or higher wages and salaries for the support reps (HAH!).
Another effect of this scenario is that fewer people want to live close to that facility. This could lower density. From what I recall, it is not population totals that are a significant factor in crime rates, but population density. Naturally, a reduction in crime rates (or less of an increase) is not only good for the area, but can save the local government money in ways I don't need to describe here. ;)
This reduced traffic goes much further than just teleworkers. Given the apparent assumption by many online shopping sites that we all have broadband, if we actually did we should see more online shopping. More online shopping reduces local traffic. Sure, we've got the delivery trucks to contend with. However, while FedEx trucks would be seen more often on the road, there would still be a significant reduction in the number of vehicles on the road. One FedEx truck making deliveries to 25 houses is certainly less of a traffic congestion contributor than 25 cars all out shopping. Further, for many there is no "one-stop shopping" site in the world of brick-and-mortar stores. So that single FedEx truck goes directly where it needs to deliver packages and that's it. So Mom and Dad aren't out hitting five or six stores looking for Junior's presents.
Fewer cars on the roads means less congestion, means less expansion pressure, means less road surface to maintain, means lower infrastructure expenditures. Fewer cars on the road during "rush hours" means less congestion, means better fuel economy, means less fuel consumed. Less gasoline consumed means less oil used/imported. Less gasoline consumed means fewer pollutants in the air. Whether Global Warming is man-made or man-influenced or not, fewer known-hazardous particulates in the air is A Good Thing[tm].
Fewer pollutants in the air means better air quality for the local residents and visitors. It also means lower enforcement costs for temporary measures. For example, when The Inversion hits Boise, burn bans go into effect. Hypothetically speaking, if telecommuting and its myriad synergistic effects could reduce vehicle emissions into the Boise air and minimize or eliminate the health risk of The Inversion, the burn ban could be eliminated. Along with it would go the enforcement costs. Not to mention the intense unpopularity of the ban among wood stove owners.
So why have so few people considered broadband as a contributor to lower civilization and environmental costs? Most of the effects are not direct, obvious, and large on their own. Too many people, particularly those in politics and those seeking big fat research grants, are only looking for the "big Hail Mary" play. They seem to forget that by moving at least 10 yards every set of downs, you will reach the end zone. People who want to control your life (so-called "environmentalists" - we refer to them around here as Environazis), or who preach conservationism (yes, preach is the right word), would rather you drive a car of their choice, would rather you give something up to achieve "conservation". They, too, forget that incremental but synergistic drops in usage/demand work. As a result they are blind to such changes.
For an example, I'll stray a bit off topic for a moment to mention another way to cut fuel consumption in this country: big rigs. By raising the allowed Gross Vehicle Weight Rating (GVWR), trucks can carry more per load. More cargo per load means fewer trips are needed, and fewer trips represent tremendous fuel savings. Michigan did this, going to a GVWR of 164,000 pounds. One of their larger (private) trucking fleets increased its load per trip by a factor of 2.5, lowering its fuel costs by the equivalent of going from 5 MPG to 12.5 MPG.[3] For those concerned about safety and road damage: adding an axle (or two) solves the "damage" problem, as it is pressure, not total weight, that causes it. More axles mean more brakes, which means more stopping power. There are more savings to be had from such a change, but this isn't the post for that. Let me know if you'd like to know more about it.
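The arithmetic behind that equivalence is worth a quick sketch (the 2.5x and 5 MPG figures are from the cited source; the rest just follows from them):

    # Effective fuel use per unit of cargo, per the Michigan example.
    base_mpg = 5.0        # a loaded rig at the old weight limit
    load_factor = 2.5     # 2.5x the cargo per trip at the higher GVWR
    effective_mpg = base_mpg * load_factor  # same fuel burn, 2.5x the freight
    print(effective_mpg)                    # 12.5 cargo-equivalent MPG
    print(f"fuel per ton of freight: {1/load_factor:.0%} of before")  # 40%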
But where are the so-called environmentalists proclaiming that to save the world we have to do this across the board, immediately? Why does cutting the trucking industry's fuel use per ton of freight to 40% of what it was - a 60% reduction - not make them salivate at the idea? Not to mention the corresponding reduction in costs to the transportation industry (and fuel is one of their two largest costs). Why indeed.
How would traffic congestion be affected in your area? Just to get a (very) rough idea, the next time you are out in traffic, count off ten cars and imagine one or two of them no longer on the road. Don't imagine too hard, they are still there after all! How much better would your commute be? How much better still if there were fewer tractor-trailers on the road?
Even better: the next time you are walking to your car to go to work imagine not needing to.
1: http://tonto.eia.doe.gov/dnav/pet/pet_move_impcus_a2_nus_ep00_im0_mbbl_m.htm
2: How does a Suburban owner get off talking about fuel economy? Simple: it's E85 powered, and only 15% of E85 is gasoline. If I get 10 MPG on E85, I am burning 0.15 gallons of gasoline per 10 miles - about 67 miles per gallon of gasoline (MPGg). Who's burning more oil now?
3: Winning the Oil Endgame published by Rocky Mountain Institute.
05 March 2006
Search Engines vs. "Content producers"
So, a porn company called "Perfect 10" sued Google for including them in search results. Shocking, isn't it? Details are given by Marc Gunther at Fortune. In his musings, he seems to think this a good thing. For some reason, people seem to be upset at search engines (and Google, being the 800-pound gorilla, is the target of choice) for making money from "their" content.
To me the first question is "so what?" Why should I care that you make money from my work? It depends on how you do it. If you take my physical property and sell it, you've done me wrong; you have stolen from me. But what about linking to my site, where I put "content"? Does that cause me harm? By itself, no, it does not.
After all, Google is essentially advertising for us. Whether or not I even try to make money from my writings, software, etc. is irrelevant. They are not selling my work. I put my work on a publicly available space and did so knowingly. Consider an alternative: can I prevent you from showing the pictures you took of my car at the auto show? No. Even if you profit from them? No. The history of automobile magazines will bear this one out.
Specifically, said company had this problem:
Perfect 10's lawyers argued that the thumbnails, which it notes are quite a bit larger than the average thumbnail, have value to the magazine because it sells small images to a British cell phone company.
Did they put the pictures on the Internet where anyone can see them? Given that Google does not search protected sites, I would say they had to have - unless Google indexed the content from someone else, in which case that someone would be the infringer. Putting content and pictures on the web w/o protection (such as a password) is very much like driving your car on the street, versus driving your prototype on private property behind walls to prevent people from seeing it.
They want to claim that since they sell thumbnails of their images to cell phone users, Google is harming them. By putting their pictures up where anyone can get them, they harm themselves - assuming there is harm. Their lawyer makes a comment about selling stuff that is free. Yes, selling pictures I can get for free is a problem for your business model. So don't put them where I can snag them myself. The fault is theirs and Google, Yahoo, etc. should not have to pay for it.
In another case Marc mentions:
Agence France Presse (AFP) is charging that Google News infringes on its copyright by displaying headlines, thumbnails and story leads without permission.
So again we have a "content producer" (actually a mere publisher, but I'll leave that nit alone for now) complaining that Google made it easier for me to find them. Going further, their lawyer states:
"If people can just take your headlines, who needs to subscribe to AFP's headline service?,"
Indeed. Yet again, I take note that AFP put these headlines out for the world to see. Again, the problem/mistake is on AFP's side, not Google's. The further question here is: why should I pay for AFP's "headline service" if I can write a quick script that will extract these headlines from their website and send them to my phone w/o paying AFP? Is that legal? You bet it is. Is it immoral? Nope. Again, they posted this information, in full knowledge and intent, on a publicly available website.
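Just to show how "quick" such a script is, here is a minimal sketch in Python. The URL and the headline markup are made up for illustration - any real site would need its own address and selector:

    # Minimal headline-scraper sketch. The URL and the <h3 class="headline">
    # markup are hypothetical, used purely for illustration.
    import re
    import urllib.request

    URL = "http://news.example.com/"  # hypothetical front page

    html = urllib.request.urlopen(URL).read().decode("utf-8", "replace")
    for headline in re.findall(r'<h3 class="headline">(.*?)</h3>', html):
        print(headline)  # from here, hand off to an email-to-SMS gateway

A cron job plus a carrier's email-to-SMS gateway gets those headlines to a phone, no subscription involved.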
Here is another one:
Authors and publishers object to a Google venture called Book Search, an ambitious attempt to scan the contents of several university libraries and make them available for search. The search results will display only a few lines of copyrighted works, which Google says is fair use. But publishers and authors say that Google has no right to scan entire works without their permission, even if only portions are displayed.
In particular, note the emphasis I added. Oh really? Since when does someone who acquired a book legally have any limitations on how they read it? Google is displaying text well within the fair use clause, and even scanning the book is fair use. I have considered scanning many of my books, but haven't had the heart to cut their spines open to do it. Why? Backup and availability, for one. If I had a private URL through which I could access my books from anywhere with a net connection, I'd be a much happier man. Further note that authors can opt out of this free advertising service Google is providing.
In closing, Marc comments:
But I make a living by writing, and it's plain to see what the Internet is doing to print media. Google News is a computer program. Real news gathering requires reporters and editors. The guys behind the Perfect 10 lawsuit may be doing the other media companies a favor.
I couldn't disagree more. First, he has a problem with what "the Internet is doing to print media". The Internet is "doing" nothing to print media - at least, nothing the printing press didn't do to monks and oral traditions. The Internet is a rapid-response, rapid-delivery content mechanism. Print is not. From the seeming aeons it takes to lay out and publish a magazine or newspaper in just the right way, to layouts that hack me off by making me jump around to follow a story, to the interminable hand-wringing over what to run when - print media is doomed with or without Google, Yahoo, and any other search engine.
If anything, these companies are doing a massive disservice to media of every kind. Should Marc ever read this blog, I'd hope he considers the fact that I've provided some minimal amount of exposure to him and his work - exposure he admits to paying other people for.
Google obeys robots.txt files. These are files a competent webmaster uses to prevent certain pages from being indexed by automated crawlers. Clearly, not even this minor level of effort is acceptable to these companies. Absurd, IMO.
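For those unfamiliar, robots.txt is a few lines of plain text at the root of a site. A minimal sketch - the paths here are hypothetical:

    # robots.txt, served from http://www.example.com/robots.txt
    # Hypothetical paths, just to show how little effort opting out takes.
    User-agent: Googlebot
    Disallow: /members/     # keep the paid gallery out of Google's index

    User-agent: *
    Disallow: /private/     # keep all compliant crawlers out of here

That is the entire "technology" a Perfect 10 or an AFP would need to keep Google out.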
Essentially what is going on is that some people figured out how to make money using the Internet, and some have not. Those who have not are pissed off and suing those who have, a time honored tradition in mankind's history. For years so-called "content producers" have been complaining that the entrenched distribution industry gave them a royal shaft. They couldn't afford to go it on their own, they claimed, because "marketing" was too costly if nothing else. So along comes a means of distribution that is relatively easy and cheap (digital internet distribution), and it is praised as a godsend for "the little guy".
Then to make it better, along come big search engines that make it easier for said "little guy" to market his content, to get noticed, to reach his market. And what happens? People get hacked off. But notice who is suing.
Is it the photographer? No. Is it the author? No. It is smallish publishers making stupid decisions. They blame Google, but where is the underlying "problem" they face? We don't need them as much anymore, and they are beside themselves with apoplexy. To his credit, Marc isn't one of them; he only wonders and has some concerns. Fair enough. But notice again: he isn't the publisher. He is an author.
Tags: Intellectual Property, Google
02 March 2006
Thoughts on MTA throughput Part 1
A short while ago I was on a conference call with a certain MTA vendor who shall remain nameless (yes that means I am not terribly impressed. Nor even slightly, if at all.). They claimed their server was "500 times faster than Postfix" in "relay only mode". They also claimed to have a throughput of "two million messages per hour". On the same OS/Hardware.
Needless to say, as a Postfix admin I was stunned. I was not stunned that anything could be faster. After all I could write a simple relay only MTA that outperforms Postfix. What stuns me is the simple math in the above statement. Combine the claim of "two million messages/hour" with "500 times faster than Postfix" and you can see the source of my astonishment. They have been making this claim for years.
For those who did not just whip out a calculator or do some simple mental math, that means they believe Postfix is only capable of about 4,000 messages per hour. *sniff-sniff*. Yeah, I smell it too.
First, let us consider some organizations we know use Postfix, and estimate whether the claim of Postfix only handling 4,000 messages/hour bears any validity.
Postini. Now here is a company that handles a lot of email. Their servers are Postfix servers. I do not have actual numbers, of course, but they claim on their main page to handle a billion messages per day. Something tells me they are not running over 10,000 Postfix servers. Bear in mind as well that Postini does spam, virus, and other types of validation, and as anyone who has been in the trenches of the spam and trojan/virus battlefield will tell you, more checking means less throughput/performance.
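For the calculator-averse, here is the math behind both jabs as a quick sketch (the billion-a-day figure is Postini's own claim; the rest follows from the vendor's numbers):

    # The vendor's two claims, taken together:
    vendor_rate = 2_000_000        # messages/hour, their claim
    speedup = 500                  # "500 times faster than Postfix"
    implied_postfix_rate = vendor_rate / speedup
    print(implied_postfix_rate)    # 4,000 messages/hour. *sniff-sniff*

    # What that rate would imply for Postini's claimed volume:
    postini_daily = 1_000_000_000  # messages/day, per their main page
    servers = postini_daily / 24 / implied_postfix_rate
    print(f"{servers:,.0f} Postfix servers needed")  # ~10,417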
One of the main contributors to Postfix (code and book) works for one of the larger financial services companies in the US. They use it. Suffice it to say that Postfix is demonstrably fast and capable. Oh, did I mention said vendor's MTA is written in .... JAVA?
I've personally witnessed Postfix servers processing over a million messages with a load of email checks involved. On much lesser hardware. With a full development environment and compiles running during the testing.
Am I surprised at the claim? Other than the sheer outrageousness of the number, no. Marketing will do anything to bag a big-name client. Am I irritated? Yes. Yes, I am. I am irritated in part because I am a Postfix guy. But what irritates me more is the complete lack of documentation and details on MTA performance testing.
What exactly is meant by "2,000,000 messages per hour"? Some would say the number of emails going through the server in an hour. That is a deceptively simple and naive definition. You see, there is as much difference in real performance as there is between a Chevy Suburban with a 350 HP motor and a Chevy Corvette with a 350 HP motor.
Consider the type of emails, the details if you will. What size messages? How many recipients? How much concurrency? How many connections? What was the sustained versus burst? How long were the messages on the system before being relayed out? Over what type of network? Was connection caching used? How about DNS setup? Recipient verification?
I'll start with a few of the simple but major ones above and save the rest for later.
Recipients
How many recipients per message? This detail is one of the more important ones. A single message sent to 200 recipients is a far different scenario to handle than 200 messages sent to a single recipient each. The difference is enormous.
Consider some basic anti-spam weaponry: recipient verification, subject checking, and body checking. If I send a single atomic message with 200 recipients, my server will perform 200 recipient validations, one subject check, and one body check. Or, if I have multiple content checks, it will be M*1, where M is the number of checks on the subject or body.
On the other hand, if I am to process 200 messages, one to each recipient, I am doing 200 recipient checks (no change), 200 subject checks (M*N, where M = checks and N = messages), and 200 body checks (M*N again). All for the same message.[1]
Which one will perform better? Which one will see a larger performance drop when filtering is enabled versus "relay mode"?
On top of that, there is the issue of connections made. One message to two hundred recipients is a single connection to the server. Two hundred identical messages to a single recipient each is two hundred connections. Any server admin worth her salt will tell you the difference betwixt the two is more than a little significant. In general, and w/o regard to any verification/filtering as discussed above, the single-instance, multiple-recipient case will have higher "throughput".
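A toy model makes the gap concrete. This is only a sketch with made-up unit costs, not a benchmark of any real MTA:

    # Toy cost model: one message to N recipients vs. N single-recipient
    # copies. All unit costs are invented purely for illustration.
    N_RECIPIENTS = 200
    M_CHECKS = 3          # a few subject/body rules
    COST_RCPT = 1         # one recipient verification
    COST_CONTENT = 5      # one subject-or-body check
    COST_CONN = 10        # accepting one SMTP connection

    # One message, 200 recipients: 1 connection, 200 rcpt checks, M content checks.
    batched = COST_CONN + N_RECIPIENTS * COST_RCPT + M_CHECKS * COST_CONTENT

    # 200 one-recipient copies: 200 connections, each with its own checks.
    split = N_RECIPIENTS * (COST_CONN + COST_RCPT + M_CHECKS * COST_CONTENT)

    print(batched, split)             # 225 vs. 5200 with these made-up costs
    print(f"{split / batched:.0f}x")  # ~23x the work for the same mail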
More later ...
Footnotes:
1. In theory I could make a hash (md5sum, for example) of the message body and first check whether I have already checked this message before. This would save me from repeating all those checks, but it adds the overhead of generating, storing, retrieving, and comparing the hash for every message.
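A sketch of that idea, with an in-memory dict standing in for whatever shared store a real filter would use (and ignoring cache expiry entirely):

    # Cache content-check verdicts by body hash, per footnote 1.
    import hashlib

    verdict_cache = {}

    def content_check(body: bytes) -> bool:
        # Stand-in for an expensive subject/body scan.
        return b"viagra" not in body.lower()

    def check_with_cache(body: bytes) -> bool:
        key = hashlib.md5(body).hexdigest()
        if key not in verdict_cache:          # pay the full price once...
            verdict_cache[key] = content_check(body)
        return verdict_cache[key]             # ...amortize it afterwards

    # 200 copies of the same message: one real scan, 199 cache hits.
    for _ in range(200):
        check_with_cache(b"Same message body, different recipient")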
01 March 2006
Minor Rant: Time Wasting Image Links
Most of us have encountered it on the web. You know it when it happens, and you dread it as your mouse hovers over the image, ready to click. Some might call it a pet peeve. For me it isn't quite to that level, but it is certainly an irritant, to say the least (the very least). What is it about clicking that link that drives us to shout at the monitor?
It is the dreaded "(click for larger image)" link. It seems that many web publishers/authors have no idea that when they make such a claim on our precious click time, we have an expectation of delivery. Namely, we want a larger image.
Not just a slightly larger image, mind you (such as seen here: http://nostarch.oreilly.com/catalog/159327064X/ ). We expect an image large enough to provide a good view of whatever it is we are looking at. If it will not help me see the picture better, do not put the link there!
I would much rather be stuck with a small embedded image (thumbnail) than click a link that opens a full new browser window (or tab, if that is your inclination) only to see the exact same image on its own, or worse yet... a smaller image.
It is that feeling of betrayed frustration, not to mention the time it takes to close the useless window my system took the time to render. Betrayal? Absolutely: my expectations were betrayed. And it makes me frustrated - at least once the anger dies down.
Honestly, people, we are all better off w/o such less-than-useless links. Just say no to them.