Saturday, August 25, 2012

The Geography of Apple v. Samsung

I'm writing this post on my "13-inch Late 2007 MacBook" while I follow updates regarding Tropical Storm Isaac on my iPad 2. My iPhone 3GS is within arm's reach, as it is the only phone I own. While I appreciate Apple's devices, I am not an Apple fanboy. I cut my proverbial teeth on MS-DOS and PC-DOS, have experience with Sun Microsystems' Solaris (both SPARC and x86 flavors) and IBM's AIX (another Unix flavor), and have used a variety of Linux distributions. However, due in part to the widespread adoption of Microsoft's programming and operating system environments, I also have some experience with every Microsoft OS since Windows 3.

Apple's victory over Samsung will have some interesting ramifications for the technology industry, some good, some not so good, and some bad.

Of some note is the disparity in international legal decisions regarding Apple's patent claims. Most Americans may not be aware that Apple has sued Samsung not only in the United States, but also in Germany, Italy, the UK, South Korea, France, the Netherlands, and Australia.

Apple won an injunction in Germany against Samsung and the Galaxy Tab 10.1. However, Apple itself is being sued by Motorola over push-notification technology in Germany. Motorola developed a push-notification technology for pagers upon which current push-notification technology is based. Sort of.

In Europe and in South Korea, Apple has not made much headway in its patent battles. Its inability to find favor abroad seems related to foreign courts being immune to the notion that rounded corners constitute a patentable feature.

Abroad, patenting environments are considerably different than in the United States. In the United States, an idea sketched on a napkin can potentially be patented. A square or rectangle with arrows denoting various features can be enough to secure a patent. Even a bit of software code can be patented.

The idea behind patenting is to protect a real and useful "concrete" technology from being stolen by others and to give the developer enough time to recover costs and make a little money. The idea must be capable of being engineered and implemented, and it must represent a new and unique advancement of the natural sciences. In other words, mathematical formulas, business processes, the presentation of information, and software cannot be patented. Put another way, you cannot hold a formula or a process in your hand; you can hold the tangible outcome of applying a process, but not the process itself.

Samsung v Apple
Are consumers really so dumb as to think a Samsung is literally a generic iPhone?


Can a shape be patentable? Should a shape be patentable? Should a gesture be patentable?

My response to the above is, no, shapes and gestures are "open source." In other words, they are not unique and represent no form of technological enhancement or contribution. In fact, language is "open source," too.

If language were not "open source," imagine the complexity of paying licensing fees to people who had licensed every word imaginable, along with every permutation of letters which might result in a new word. Every author would be instantly bankrupt.

Yet, in the United States, that seems to be where we are heading. The US Patent Office allows companies to patent their software code. Allowing the patenting of software code is much like allowing Stephen King to patent all the words in his books so no one else could use those words. What Stephen King actually does is copyright his stories, and to some extent that protects him from other writers copying his work verbatim. Likewise, musicians cannot copyright the notes they use, but they can copyright the composition and lyrics.

In Europe and abroad, computer languages are seen much like spoken or written languages. Companies are limited in what they can protect with software patents. Code which accesses a DVD drive, or a software port, or a mouse, or calculates the volume of a sphere (I'm sure there are better examples) cannot be patented, simply because of the finite number of ways such operations can be performed. Suing someone over button placement, button appearance, or button behavior is simply asinine in most European countries because these do not meet the criteria of being new and unique advancements.

In the United States, though, we seem to be heading down the path where even small snippets of code are patentable.

Take the Oracle v. Google lawsuit over Java. Google was able to eke out a winning decision against Oracle, though the judge did note the judgment did not apply to all APIs. In Europe, a similar lawsuit would likely have failed as well: the Court of Justice of the European Union ruled against the US company SAS, stating "a computer language or the functionality of a computer program cannot be copyrighted."

The theme behind my comments is the change the US patent system must implement to promote innovation and to squelch those who simply want to be paid for property falsely labeled as "intellectual." Rounded edges, pinched screens, and vague descriptions of techniques for communicating over copper wire or wireless networks need to be filtered out before entering the patent system.
"Oh, a smoothed rectangle works well in pockets, briefcases, and backpacks. Now, let's build a brilliant phone"

One comment I read this morning hinted the decision might be a good one for the industry. The judgment against Samsung will foster MORE innovation, not less, the contributor argued, because people will have to be more creative to avoid Apple's design elements. Companies do need to stop jumping on the bandwagon, for sure. Samsung is a global telecommunications company and has been a dominant player in the smartphone market. True, Samsung made some very bad choices which essentially gave the jury no option but to find in Apple's favor. However, if Samsung had simply said, "Oh, a smoothed rectangle works well in pockets, briefcases, and backpacks. Now, let's build a brilliant phone," they probably would have been OK. Instead, they opted to nearly reverse-engineer an iPhone, which was really dumb, and now Samsung's own stupidity has brought a queasy sense of dread to an entire technology sector, putting everyone on edge. Hopefully, though, the verdict will act as a cold splash of reality in the faces of technology design teams around the world: "Oh, shit, we've been mesmerized by Apple. Duh. OK, everyone, let's get creative."

As a direct result of the decision, lawyers will undoubtedly play an even bigger role in technology design, simply to prevent further patent litigation. Is this really good for innovation, having more legalese involved simply to make sure Technology A does not "steal" from Technology B? Do we really want lawyers designing our technology?
"No, the radius of the curve is 0.57. The i6 has a curve radius of 0.54. That is only a 5% difference; that will never hold up in court. You're going to have to use a chamfered corner, instead.

"No, you cannot use rounded icons. Let me consult the list of open source shapes, and those shapes which have yet to be patented. Hmm, well, octagons are out. The octagon shape has been patented by the UFC. Uhm, no pentagons, either. Homeland Security doesn't want terrorists to be constantly reminded of what the Department of Defense building looks like. No triangles, either; the GLBT community has patented all triangle variants and most shades of pink and purple. Well, I'm going to have to get back to you on this."

All of this might seem silly, yet in light of swimmer Ryan Lochte seeking to trademark the word "Jeah!" my comments seem pretty realistic.

Patents, in a sense, create a monopoly for the owner. In a corporate sense, they can create hegemony, which is worse: complete ownership and domination. An examination of Apple's suits worldwide, as discussed earlier, almost suggests Apple is looking not simply for protection of intellectual property but for complete hegemony over the global telecommunications market, using local, regional, national, and international court systems.

I realize the jury made its judgment based on current US legal doctrine. The judgment might be "lawful," but was it "prudent"? Recently, the Norwegian mass murderer Anders Breivik was sentenced to 21 years in prison for his crimes, the maximum sentence in Norway. In the United States, Breivik would with 100% surety have spent his remaining days in prison, or been executed. Breivik's sentence was "lawful" in that he was punished to the maximum Norwegian courts allow, but was his punishment in proportion to the deaths of 77 people, most of them teenagers? That is my point. Sometimes, what is lawful is not necessarily proportional to the crime, nor prudent.
Side note: Breivik will probably never again see the light of day. After he commented in court that he didn't kill enough people, the judges will most likely extend his sentence. In Norway, while the maximum sentence is 21 years, options exist for extending prison terms in five-year increments.

Patents were designed to be granted for tangible objects which represent new and unique technologies developed from knowledge of the natural sciences. A CD, a DVD, a Blu-ray disc, or an SD card reflects patentable material. However, the patent laws in effect today have not changed much to account for new technology, innovation, or rapidly changing global economies. Pharmaceuticals are in the same IP and patent boat, which affects the prices we pay for medication, potentially keeping them artificially higher than they should be.

The patent process needs to be fully examined to ensure fair and equitable standards are implemented which work to the benefit of designers, developers, and consumers.

Monday, August 13, 2012

Are You An Advanced Learner?

The other day I was reading an article while waiting for Curiosity to set down on Mars. The author described two types of learners. Not that there are only two types, but the author had identified two specific types of learners, "Novice" and "Advanced."

Suppose the nature of a project is to build the 2nd Generation Mars Rover. Engineers, geologists, chemists, physicists, whoever, gather to design and discuss the component details. Over the following 6 years the "iRover 2" evolves. Prototypes are built. Components designed, engineered, and fabricated. Software written. Problems and dilemmas addressed and solved.

August of 2019 sees three new rovers rolling over the Martian terrain, capturing data in unprecedented detail. New cameras, new optics, new compression algorithms, and new, robust materials reduce weight yet strengthen these new rovers, Larry, Moe, and Curly. Larry, Moe, and Curly are semi-intelligent (like their namesakes) but, unlike their namesakes, use the latest state-of-the-art GPS and LiDAR for land navigation and route-finding, augmented by a flock of Albatross-class Mars drones capable of staying aloft for years collecting atmospheric aerosols and moderate-resolution LiDAR data.

NASA in 2019, as in 2012, finds unparalleled success in these endeavors. Costs have been minimized while the results are simply phenomenal. The scientific community revels in the unprecedented scale and complexity of the ecosystem of robotic vehicles charting our neighboring planet.

Yet, some are not enthusiastic. Engineers, chemists, and computer scientists speak out against any further investment in Mars.
"What did we really achieve?" they ask.
"Too much money was spent on these sexy missions with no real tangible benefits."
"Yes, I helped write code; yes, I helped design and fabricate materials; yes, some new alloys and processes were used to improve strength and radiation shielding."
"Could not the funds have been better spent on something else?"
Those are some paraphrased responses I either read or heard subsequent to the successful Curiosity landing. They tend to exemplify the voices of those who fall into the "Novice" learner category.


One of the first color images sent by Curiosity. Morse code in the tread is clearly seen.


Enthusiastic folks intoxicated by the Rube Goldberg-inspired deployment of Curiosity see ... the future. By the future, I mean "possibilities" and "potential" for growth in all sorts of new and unique fields. The commercial and industrial applications of Science, Engineering, Physics, and Chemistry fill adult minds with child-like fervor.

Every molecule of sweat equity poured into Curiosity translates into lessons learned.

How much did we, and by "we" I mean Humanity in the form of the NASA employees, engineers, and the lot who all chipped in to make Curiosity a success, learn from our experience? How much will we learn? Whatever we learn from Curiosity may pale in comparison to what has already been achieved in simply getting a nearly 2,000-pound remote-controlled vehicle across 350 million miles of space and safely down and operational on another planet.

Advanced learners do not only see the problem, or task, at hand. Advanced learners see the task as an opportunity to learn something about the process. Even though Curiosity landed in August 2012, the technology built into Curiosity was benchmarked to 2004. Like the Space Shuttles: I had more computing power in my personal computer than at least two Space Shuttles. Perhaps all of them.

But, Advanced Learners are not merely concerned with the nature of the task at hand. No, the entire process becomes an ecosystem for learning. Organizing teams, leadership, team management, the "human-side" of technology is only one part of the system. The technology, the engineering, and all of the other sciences come into play. What did we learn about what worked? Why did it work, and work so well? What did we learn about what broke? Why did it break? Now, and this is the key point, how can we take the sum total of our experience, both the tangibles and intangibles, and apply our ecosystem of knowledge to new and different problems?

But being an Advanced Learner does not mean putting a remote-controlled, laser-armed buggy on a distant planet. To be an Advanced Learner, one only needs to look around, really.
Curiosity is not about exploring Mars

What do tweets about beer and church have to do with anything?


Here is an article from Floating Sheep about a group of college students who mapped tweets. What does mapping tweets have to do with anything other than tweets, beer, and church?

Floating Sheep is a team of geographers. One of the team corralled some students, and they all learned how to use the Twitter API to pull out geo-located tweets. The tweets of interest fell into two specific groups: those which dealt with "beer," and those which dealt with "church." A marvelous idea on the surface and, as the article suggests, a great way to introduce students to spatial statistics and research.
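The basic pipeline the students followed, pull geo-located tweets, filter by keyword, and aggregate them spatially, can be sketched in a few lines of Python. The tweets below are entirely made up for illustration; in practice the (latitude, longitude, text) tuples would come from the Twitter API, and the keyword matching would be more careful than a simple substring test.

```python
from collections import Counter

def bin_tweets(tweets, keyword, cell_size=1.0):
    """Count tweets containing `keyword`, binned into lat/lon grid cells."""
    counts = Counter()
    for lat, lon, text in tweets:
        if keyword in text.lower():
            # Floor-divide coordinates into grid-cell indices.
            cell = (int(lat // cell_size), int(lon // cell_size))
            counts[cell] += 1
    return counts

# Hypothetical geo-located tweets: (latitude, longitude, text).
tweets = [
    (38.2, -85.7, "Cold beer after the game"),
    (38.9, -85.1, "Church potluck this Sunday"),
    (36.1, -86.8, "Beer and BBQ tonight"),
    (38.4, -85.3, "Great sermon at church today"),
]

print(bin_tweets(tweets, "beer"))
print(bin_tweets(tweets, "church"))
```

Grid-cell counts like these are the natural input for a choropleth or dot-density map, which is presumably where the students took them next.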

As I pitched this idea around, I discovered many folks didn't see the need to map both "beer" and "church." Almost to a person, no one saw the point of mapping beer and church and the inherent spatial auto-correlation between the two.

The mapping project was not about "beer" and "church," just like putting Curiosity on Mars was not really about putting Curiosity on Mars. These are not simply Proofs of Concept; these are Proofs of Fruition, the fruition of hopes, dreams, and ideas made manifest. The maps of "beer" and "church" could just as easily have been about "beer" and "DUIs," or "tequila" and "pregnancy" (I'd like to see that spatial auto-correlation). On a more serious note, the maps could have been about "weapons" and "dark knight," or "shoot" and "Obama."
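For the curious, the relationship between two keyword maps can be approximated crudely: once each keyword has been counted per grid cell, an ordinary Pearson correlation over the shared cells indicates whether the two counts rise and fall together across space. A minimal sketch, with hypothetical per-cell counts:

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical counts per grid cell: {cell: count}.
beer   = {(38, -86): 12, (36, -87): 30, (35, -90): 8,  (40, -83): 20}
church = {(38, -86): 25, (36, -87): 40, (35, -90): 15, (40, -83): 28}

# Align both maps over the union of cells, treating missing cells as zero.
cells = sorted(set(beer) | set(church))
xs = [beer.get(c, 0) for c in cells]
ys = [church.get(c, 0) for c in cells]
print(round(pearson(xs, ys), 3))
```

A real analysis would use a proper spatial statistic such as Moran's I and control for population density, but the idea is the same.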

Learning how to map the relationship between beer and church is not about mapping the relationship between beer and church. That is a side-effect, a by-product.

Curiosity is not about exploring Mars; that is a by-product. Everything which went into developing Curiosity was the real reward. The International Space Station is not about the International Space Station.

All of these endeavors are about putting our minds to work in uniquely challenging ways and expanding our potential to continue putting our minds to work in uniquely challenging ways.

Saturday, August 11, 2012

What Is Facebook Bankruptcy?

In 2003 or 2004, I can't remember which year, I signed up for Facebook. I found people who I remembered from my past, people I knew at that time, and people who shared similar interests of mine.

Facebook allowed me, like everyone, to share sites, information, jokes, etc. Keeping up with the lives of friends, family, acquaintances, and news from groups was a cinch.

The more people I added, the more disenchanted I became with Facebook. See, when people share the mundane details of their lives, you begin to see how mundane people's lives are. Mundane is not a bad thing; perhaps mundane is not precisely what I mean. People are so caught up in going about their lives: grocery shopping, feeding kids, watching movies, and so on. Their lives are so busy simply being led that being a well-informed, critical-thinking person is nearly impossible.

And being a well-informed, critically thinking person takes work. I know; I work hard at being both every day, and critical thinking does not come naturally to me, for sure.

I found the vast majority of people on my Facebook feed lacked information, were not critical thinkers, and were not really interested in overcoming ignorance or developing critical-thinking skills. Sad. The vast amount of dogma, the entrenched beliefs, and the unwillingness to consider alternatives I found completely disheartening.

Then, to make matters worse, my contacts on Facebook had friends who were racists, bigots, and Christian fundamentalists. If there is one thing I cannot tolerate, it's intolerance. My brain simply refuses to cooperate and communicate with people who cannot question and contemplate alternatives.

The final straw was a man who declared himself to be a Vietnam vet, a Green Beret, and a local gun-toting Baptist preacher. He essentially wants a white Christian America, with no Blacks, no other minorities, no foreigners, no immigrants, simply a White-bred America. Furthermore, no American should ever be taught another language, not Spanish, or French (oh, God, no), and certainly not Arabic. He proceeded to insult and berate a Muslim friend of mine, and was joined by other like-minded people in his verbal attack. I joined the conversation, but discovered we were not debating, and there was no chance of me even wedging an alternative thought into his tiny brain. And by tiny, I really do mean tiny, and probably damaged, to be as hostile to other cultures as this purported Christian minister was.

When I contemplated the sheer volume of cultural ignorance I discovered on Facebook, and not just ignorance, but racism, and bigotry, and the overwhelming banality of Facebook, I resigned.

I resigned two years ago and I really haven't looked back.
I declared Facebook bankruptcy.

Facebook is a graffiti-scrawled toilet stall in the Internet bathroom. My theory of cultural evolution states "society can claim to be only as advanced as the care given to its toilets." The reason Facebook's shares have fallen might be explained by the fact that no one wants to hang out in a bathroom for very long. Gossip is spread in bathrooms; germs are spread in bathrooms. Who wants to buy advertising in a decrepit bathroom?

I doubt I will rejoin Facebook. I don't want to jump back into an environment which is mostly toxic, full of poisonous ideas, and rampant with ignorance. I really don't want any more evidence supporting the notion that 99% of people are mostly concerned with living their own lives, without encouraging others to be more adaptive and receptive to new knowledge.
"society can claim to be only as advanced as the care given to its toilets."

I cannot handle the blind and raging ignorance of Facebook. I thought, for about two seconds, of selectively "unfriending" people on Facebook. However, people go bonkers when "unfriended." People take "unfriending" far too seriously and tend to carry the connection into Real Life. Again, proof of the banality and insipid nature of Facebook.

And, I have Twitter. When I declared Facebook bankruptcy, I invested in Twitter. Twitter is superior to Facebook in that I do not really know who I follow, and I do not really know who follows me. I do care, though. But I also realize people may follow or unfollow as they wish, like moving through a crowded convention eavesdropping on conversations. No one's feelings get hurt if a person "unfollows"; at least that is the rule.

I only follow people I think I can learn something from or who have something interesting to contribute to my Body of Knowledge. I use Twitter to follow scientists, astronomers, engineers, programmers, and a bunch of educators. I follow people in Geography, Archaeology, Astronomy, and Computer Science, plus an array of science organizations, NASA, NOAA, the Association of American Geographers, CERN, a host of news agencies, and, best of all, creative writers and thinkers, like Neil Gaiman, Tony Lee, David Brin, Cory Doctorow, Liana Brooks, et al.
I only follow people I think I can learn something from or who have something interesting to contribute to my Body of Knowledge.

Yes, I could do the same with Facebook. Yes, I have lost out on some contests, and my Farmville farm has been foreclosed on. But I don't have to filter content, ignore contacts, or choose to "confirm" or "deny." I can follow or unfollow at will. And I don't have Farmville, or Petville, or Fishville, or Zombieville, or Ignorant-Pinhead-Racistville to distract me from learning and interacting with others from whom I can learn more. And, see, if I wanted to find junk on Twitter, I could do so. I know I could find all the hate, bigotry, racism, and ignorance I could possibly stand, and then some. But I don't go looking for it. And while sometimes those elements do creep into my Twitter feed, they are broadcast by unknowns, not by people who are supposed to be respectable "friends." Or these elements are retweeted by like-minded people using such tweets as examples of the ignorance in our society. I go looking for good people, good thinkers, good ideas, good technology, good organizations.

If I could figure out a way to start over from scratch, with a brand-new Facebook account, with my true name, not an alias or nom de Facebook, and "follow" people on Facebook from whom I could learn and be challenged, I might do that. If Facebook would allow a person to "reset" their account and all credentials back to zero, I might contemplate it. Maybe. I think that would be like giving the truck-stop toilet a good cleansing, though, knowing full well in a week or two it's going to get filthy again.

If you are tired of Facebook drama, maybe Facebook bankruptcy is for you.

Friday, August 10, 2012

Geography is a Nerd's Paradise

The last few days I have spent working on an older-model Dell PowerEdge 2950 rack-mounted server. Last week, I got a wild hair to upgrade my level of support for GIS on campus. To improve my support and promote GIS on campus, I decided implementing an intranet web site would satisfy both me and my end-users.

I remember setting up and running Apache back in the mid-90s. In order to provide some rudimentary map services, Tomcat was added to the Apache server to handle map requests. Looking back on those days, I feel some kinship to the early days of surgery, when the patient was given shots of whiskey and something to bite on in order to set a broken bone. Neither Windows nor Unix was completely happy running web sites, though Unix-hosted web servers seemed more amenable to the task.

My friends and I found early versions of Linux easier to deal with than their Windows equivalents. But that statement comes from a member of a small group who found installing multiple operating systems on a single computer a fun way to spend a Friday night. Using utilities such as "Boot Commander," we could run Windows, Linux, and Sun Solaris all from a single computer. My memory also seems to recall my friend Roger managed to build a computer that booted no fewer than four operating systems.

Yes, those were the days :)

I set about figuring out how I wanted to serve my clientele. Not that any of them would care about my web server, my back-end database (MySQL), my programming environment (PHP), or my choice of content management system (CMS). As with any experience, I also wanted to learn something from putting these pieces together. Time is also an issue. Classes begin soon, and our academic factory and machinery will soon begin injecting young impressionable minds with thoughts, ideas, facts, and knowledge.

Microsoft has an interesting utility, the Web Platform Installer. "WPI" is a small utility which takes a snapshot of your server and responds with a smorgasbord of server tools, software platforms, database recommendations, and content management software. Not only does WPI toss out recommendations, but it will go ahead and install all the prerequisite software needed to run any given software package or CMS.

For example, when I initially set out to set up my GIS repository support site, I wanted to use WordPress. I selected WordPress from the choices presented, and WPI began installing all the necessary prerequisite software: MySQL, PHP, and IIS7. Very slick.

Very slick, except all did not go exactly as the directions promised. After examining logs of install details, re-reading the directions, and going back to the install logs, I finally had a working WordPress site.

[caption id="attachment_1226" align="alignleft" width="150"]Niklaus Wirth Niklaus Wirth, father of Pascal[/caption]

And, like a lot of things I tend to invest time in, I decided, "Why use a platform I'm already familiar with? If I'm going to learn something, I need to use a CMS which is popular and not WordPress." I also have a constant recurring thought that I need to keep learning, as I might need a new job one day, and I want to be able to tell a potential employer I have nothing against being exposed to new and different technology. I just don't want to program. I realize programming comes with all of the above details; I simply do not want to be hired to code. There are people who enjoy coding and are good at coding, and I am neither. I would rather change a light bulb on a cell phone tower, actually. Or clean an alligator's pen. I've done neither of those tasks, but I've done enough programming, Pascal, Fortran, Basic, C/C++, and ASP, to know I can't see myself programming.

I dumped my entire WordPress installation. Not only did I dump my entire WordPress site, but in doing so I managed to uninstall some prerequisites out of order and botched my server. No worries; I simply rebuilt the server from DVDs, service-packed and patched it, and BOOM, I had a clean machine again.

What to choose? I googled about a bit, running across a ComputerWorld article describing a CMS shoot-out. The article describes one man's challenges with WordPress, Joomla!, and Drupal. Initially, I had thought I would use Joomla!, for no particular reason other than the exclamation point demanding my attention. After reading the article, though, the author seemed somewhat miffed and stymied by Joomla!. His miffedness carried over to me (sorry, Joomla! users) and I opted for Drupal.

[caption id="attachment_1227" align="alignleft" width="142"]Drupal Am I allowed to use a Drupal logo on WordPress?[/caption]

Again, I returned to Microsoft's WPI to help expedite my installs. OK, so why did I not bother installing all of these disparate elements individually? Why did I cheat, in other words? I leaned on WPI to speed my time-to-live so I could better leverage my time. I am a one-person GIS support shop and have little time to invest in the bothersome details and nuances of config file editing and customization. I have a site to configure, develop, and launch very soon.

With WPI, I merely selected Drupal from the very long list of content management systems I could have installed, selected "Add," and then I pretty much went home. When I returned to my office later, I had a pending server restart, which I performed.

OK, not quite. I had to create some database items, account names and passwords, and provide a web site name and install locations; then I went home, to return later and restart my server. WPI handled the details of installing IIS7, MySQL, PHP, and Drupal.

Part of yesterday, I reviewed some free Drupal themes and opted for one called "Professional Business." I downloaded and installed it, and soon had a new, fresh, customizable Drupal-powered intranet web site ready to play with.

Which is exactly what I did for most of today. My, how times have changed: the days of building pages from scratch in HTML, while not really history, seem quaint. All of my initial web sites I built from scratch, using mostly HTML with some JavaScript to handle mouse-overs and other behaviors. Today, CMSs provide users an interface which removes some of the coding burden. I can tell already I'm going to need to dig out some HTML to arrive at the look I want.

Geography is a discipline, make no mistake. I ran across a geography forum where a student had posed the question, "Is geography a discipline or a tool? My friends say geography is a tool." My reply: tell your friends they're tools.

I have a Bachelor's degree and a Master's degree in Geography. I'd have a Ph.D. in Geography, if only to have more opportunities open to me, but I allowed myself to be distracted and fell off course. To be a professional geographer today, and moving forward into the future, knowledge of technology is paramount.

In the early 20th century, geography involved exploring, getting your hands dirty (literally), drawing and mapping, learning languages and cultures, and numbers and statistics. But you didn't have to learn technology; there wasn't any. The most complicated technology a geographer might have to use was a level and transit, and maybe a sextant. Generally, though, pen, paper, and slide rule marked the technological limits for a professional geographer.

In the 21st century, all of the above still hold true. All the above, even the part about the level and transit; maybe not the sextant, though.

But in addition to all of those details, to be a professional geographer one must also be able to swim with the technology sharks and learn the language of technology. I do not simply mean "megabytes" or "gigabytes" or "bits per second" (these are part of a geographer's lingo, though); a professional geographer needs to know the terms associated with bandwidth, processor speeds, color depth, mobile technology, graphics, network infrastructure, databases ... all sorts of technology.

Not only can traveling abroad make life interesting and the addition of a foreign language to one's skill set prove valuable, but professional geographers also add programming languages, like Python or some flavor of .NET.

Professional geographers also need some knowledge of mobile technology and mapping. Hand-in-hand goes knowledge of iOS and Android, and of Java (Android's language) and/or Objective-C (the programming language of iOS). There are also numerous software development kits (SDKs) which can help geographers design and develop apps.

One aspect I enjoy as a professional geographer is the graphics arts and design of cartography, web design and layout, and general use of images and imagery. Millions of people are now using Google Earth, Bing Maps, Google Maps, and even now Apple and Amazon are leveraging their individual purchases of small mapping companies to build geography content into upcoming tablet devices.

Geography has something for everyone, but not everyone will be able to earn a living from being a geographer without also having a plan in place. By plan, I mean having put thought into utilizing the knowledge of all of geography's tools, methods, and techniques for fun and profit. My comment is true for nearly every discipline, though.

So, yeah, Geography is a Nerd's Paradise.

Wednesday, August 8, 2012

Book Review: Nemesis

Nemesis by Jo Nesbø. Harper Fiction. 2002. Translation 2008. Paperback. $8

"A man walks into an Oslo bank, puts a gun to a cashier's head, and tells her to count to twenty-five. When he doesn't get his money fast enough, he pulls the trigger. The young woman dies--and two million Norwegian kroner disappear without a trace."

And so our story opens. And, if this were only the story, the story of the murder of a bank cashier and Harry Hole's (Hole is pronounced "Hoe-lay") persistence in catching her killer, we would have an interesting story.

Jo Nesbø does not write in such simple ways, though. I recently flew to San Diego. The flight magazines in the seatbacks usually have a map of the airline's routes. Picture the flight lines connecting cities with other cities. Remember the complexity of those maps? Lines which stretched the entire width of the continental United States, and intermediate flight lengths, and shorter flight lengths? Jo is pronounced "Yo," by the way.

I think Nesbø must use those airline service route maps to develop the plots of his novels. Over the last year or so, I have moved from my traditional science fiction and fantasy milieu into procedural crime fiction. I began with Stieg Larsson, who was criticized by reviewers for his highly intricate plots. I've read Henning Mankell, who writes good procedural crime fiction, also set in Sweden. James Lee Burke, whom I have reviewed before, sets his stories in southern Louisiana. I've read forensic science authors Kathy Reichs and Patricia Cornwell. Not an immensely impressive resume of crime fiction; I would throw James Patterson into the mix, but honestly, his novels do not come close to the intricacies I require. I read two of his Alex Cross novels and I am done. Science fiction and fantasy authors, by their nature, have to create intricate worlds, with themes, allegories, and complex plot devices to draw in their readers. We expect those traits. Having spent all of my childhood and the better part of my adulthood reading science fiction and fantasy, I have become accustomed to reading prose with a high degree of complexity. The above authors (with the exception of Patterson) deliver to my expectations.

Nesbø's "Nemesis" is probably the most complex procedural crime novel I have read. My opinion may change given enough time, as I only finished the book yesterday. However, I found myself having to re-read passages simply to understand the nuances of actions and details.

The bank robbery is not a bank robbery, and a young woman is killed. Identical twin brothers commit a crime; one is caught and one escapes. But which one killed the police officer and was sentenced to prison, and which one got away? Hole killed his lover, or did he? And what does the jailed twin have to do with her death? A woman appears to commit suicide, but on second look she might have been murdered by her jealous husband, jealous because she was sleeping with his brother. Furthermore, why is Hole's former partner so intent on having Harry arrested for a murder he may not have committed?

Yes, if those details sound confusing, you are not alone. Nesbø ties most of these tales together into one mostly complete novel. I say "mostly complete" because one loose end continues into Nesbø's next novel, The Devil's Star. Yes, I had to re-read some passages simply to keep my head in the game. Near the middle of the novel, Hole has pretty much solved the bank robbery, or so it would appear. From the mid-point on, the novel almost begins anew, with the introduction of a new character with a new motivation and a desire to exact some form of retribution against Hole.

When the novel seemed complete at the mid-point and new characters were introduced, I felt as if one of the cardinal rules of crime fiction had been violated, something I will call the "trail of discovery." The Trail of Discovery means the author leads the reader through a consistent set of rules and parameters for the unfolding of events. Basically, all stories follow this rule. Authors cannot simply build a pattern of evidence and then, in the last three pages, introduce a new character or new plot vehicle which renders all aforementioned clues irrelevant. Halfway through "Nemesis" I felt Nesbø was doing precisely this; what kept me reading was the fact that I was only halfway through. Nesbø has an international reputation, and I could not believe readers would popularize an author who committed such egregious plot contrivances. So, I kept reading. And the plot became far more complicated than I ever imagined. For good reason, too, as the robbery, the murders, and the suicide all fold into one engrossing tale of lies, deception, and murder. Everything a good crime novel entails.

"Nemesis" was my first Jo Nesbø novel. My rule is I give an author who is new to me two chances to impress me and sell me on his/her style and story-telling. "Nemesis" is the second in at least a trilogy of Harry Hole novels, the first being "The Redbreast" and the third, as I mentioned earlier, "The Devil's Star." Now, I've got to pick up the other two.

Pax

Business Model? We Don't Need No Business Model!

Few things raise my hackles more than irrational or illogical behavior. And, in this collection, I also include my own behavior. Take, for example, the recent news of Timothy Arnold, a University of Central Florida student who developed an application. His application, UCouldFinish, helped students find courses with open seats. Armed with this knowledge, over 500 students found acceptable courses in which to enroll. In higher education, 500 students x 3 credit hours equals 1,500 credit hours. Now, UCF charges $208 per credit hour (site). Doing some simple math: 1,500 x $208 comes to $312,000; call it roughly $300,000. Hypothetically, Mr. Arnold helped UCF bring in about $300,000 with his application.
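The back-of-envelope revenue estimate above can be sketched in a few lines of Python. The student count, credit-hour load, and per-hour rate are the rough figures quoted in this post, not audited numbers:

```python
# Back-of-envelope estimate of the tuition revenue tied to the
# course seats found through UCouldFinish. Inputs are the post's
# rough figures, not audited numbers.
students = 500               # students who found open seats via the app
credit_hours_each = 3        # one 3-credit-hour course per student
rate_per_credit_hour = 208   # UCF's quoted rate, in dollars

total_credit_hours = students * credit_hours_each    # 1,500 hours
revenue = total_credit_hours * rate_per_credit_hour  # $312,000

print(f"{total_credit_hours} credit hours -> ${revenue:,}")
# -> 1500 credit hours -> $312,000
```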

According to the UCouldFinish web site, Timothy has been punished for being so bold as to help his university improve the conditions of his fellow students while also improving the financial environment of his university. Mr. Arnold's application earned him 3 semesters of academic probation, a research paper on the management of MyUCF, and counseling on how to make good choices.

To throw some positive vibes on this debacle: the 3 semesters of probation are not a big deal. Timothy sounds like a smart guy. He will either make those semesters pass easily, or he will bid UCF adieu and find a more suitable uni to finish his education. My sense is a company like Google, or Facebook, or Twitter will offer him a job. Writing a research paper on the UCF ERP system's IT will undoubtedly require some research. He will need to speak with many IT folks, network analysts, programmers, and system admins to get a sense of the breadth and depth of the inner workings of the ERP system in order to understand the impact his application had upon UCF IT. All of his research will undoubtedly result in more experience dealing with systems, programming, and the entire ecosystem of higher ed IT, giving him even more insight into the trials and tribulations of systems administration. Thus, in writing the research paper, he could end up with even greater knowledge of how to exploit...er, augment educational IT systems.

In a broader sense, Timothy Arnold v. UCF illustrates a much more important and sobering issue in Higher Education. Traditional higher education, public & private colleges and universities, is being assaulted on numerous fronts. Set aside for a moment the Conservative/Right-wing/GOP gutting of education in general. The attacks committed against Education by Republicans are well-documented and are a true source of concern. However, critical thinking requires us to ask, why? Why is there such animosity against Education, especially Higher Education?
I submit one essential reason: Higher Education has become sclerotic, hard, calcified, resistant to change and adaptation, entrenched in a management mindset at least a century old. For traditional colleges and universities not only to remain relevant but to serve current and future educational needs, the Higher Education business model needs to evolve from a 19th-century academic mindset into a mindset capable of being nimble and adaptive to 21st-century educational needs. If Higher Education had been more vigilant in adapting and evolving new business and academic models, these specious attacks would have no merit.

I propose a number of remedies to improve our College and University business models. First, classes need to be able to be added & dropped as needed. Doing so will require some transcript flexibility. Department chairs and deans are best able to support student advising and transcript adjustment. ERP systems need to be adaptive enough to manage student academic records, as well.

Colleges and Universities must be allowed to leverage faculty, staff, and student expertise. Unquestionably and undeniably, colleges and universities must be open and receptive to using on-site knowledge, experience, and expertise to address the daily and long-term mission of the institution. As Timothy Arnold attempted, when a need is identified, such as the notification system Timothy developed, that need should be immediately addressed. Students should be allowed to help design and implement content management systems (CMS) and help manage institutional web sites. Students should be allowed to help develop institutional applications which improve the overall function of the institution. One of the greatest complaints new graduates have, and have always had, is a lack of experience. Students assisting with all aspects of institutional management, planning, and IT will have experience upon graduation: perhaps experience with Drupal, Joomla!, PHP, Python, WordPress, JavaScript, HTML5, SQL and SQL Server, or Oracle. Graduating students would have experience in application development, design, implementation, and enhancement. A phenomenal example of such a team is the InfoGraphics Lab in the Department of Geography at the University of Oregon.

Online courses, for-profit schools, and the growing presence of massive open online courses (MOOCs), along with the movement of so much content into "the cloud," are going to push universities to evolve into something new. Those who do not evolve, who do not revolutionize their educational business model, are going to slide away into obscurity.

Current business models and the closely related academic models will need to evolve into something else. Global economic forces change faster than the seasons or global climate change predictions. Business, industry, and global commercial sectors need employees who can adapt to change and work within fluid environments. Industry needs nimble thinkers who can address current business trends yet also act quickly and foresee the changes in technology, consumer tastes, and trends in raw materials and labor markets which are embedded in Globalization.

Of current concern is unemployment. As of this writing, 12.8 million people are unemployed and the national unemployment rate is 8.3%. Again, doing some quick math, 12.8 / 0.083 gives a labor force of roughly 154 million Americans, which is about 50% of our population; that is my "rule of thumb" for the size of any country's labor force (about 50% of population, give or take) (Source: BLS).

Also as of this writing, there are about 3.8 million open job postings. We can debate the accuracy of that number, I will grant. But hypothetically, let's say ...entertain me here... 3 million of those open jobs were filled by people from the 12.8 million group. Then we would have only 9.8 million unemployed. Simple math gives us 9.8 / 154, which equals about 6.4% unemployment (Source: BLS).

Now, 6.4% is not bad. Not great, but certainly better than 8%. The question then becomes, "Why are those jobs not being filled by people in the 12.8 million group?"
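The quick labor-force arithmetic above can be sketched in Python. All of the inputs are this post's approximations of the BLS figures, not exact data:

```python
# Rough labor-force arithmetic. Inputs are the post's
# approximations of BLS figures, not exact data.
unemployed_millions = 12.8   # unemployed persons, in millions
unemployment_rate = 0.083    # national unemployment rate (8.3%)

# Labor force implied by the two figures above: unemployed / rate.
labor_force = unemployed_millions / unemployment_rate  # ~154 million

# Hypothetical: 3 million of the open postings get filled
# by people from the unemployed pool.
remaining_unemployed = unemployed_millions - 3.0       # 9.8 million
new_rate = remaining_unemployed / labor_force

print(f"labor force ~{labor_force:.0f}M, new rate ~{new_rate:.1%}")
# -> labor force ~154M, new rate ~6.4%
```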

I don't have an answer to that question. I know some factors include an unwillingness to move to where the jobs are, the pay of those open jobs, and the gap between the experience of the unemployed group and the experience those open jobs require.

Retraining & re-education can help fill some of those jobs. Re-education requires colleges and universities to be open, fair, and honest with themselves; to be open to new ideas; to fail fast, fail often, and learn and grow from those experiences; and to be ever-evolving. Self-assessment mandates interaction with business and industry to make sure graduates have both the education and the experience necessary to succeed.

Higher Education cannot become a collection of stove-pipes, or educational "silos" immune to local, state, national, or international influences and catalysts. Those institutions which pay attention will succeed through their graduates. Those institutions which fail to adapt to 21st-century educational changes and evolving business needs will falter, suffer under constant mis-identification of goals and mis-aligned "mission statements," and find themselves undergoing continual identity crises which will do nothing but undermine institutional confidence at all levels.

Institutions of Higher Learning represent a bizarre dichotomy of work environments. On one hand, some of the most brilliant people on the planet are employed by them; on the other, some of the most pedantic, too. Even the educated folks are routinely bound by limited vision and are what I might label "novice learners," unable to see past their own discipline, specialty area, or responsibility. Thus, the very institutions which were created to promote critical thinking and openness are just as poisoned as any other institution or corporation.

So, yes, my hackles rise when these institutions victimize innovators, stifle advancement and creativity, stymie new ideas and concepts, and squash young, middle-aged, and older minds in favor of "tradition" and "that's just not the way it's done" mentalities.

I invite all rational and reasonable comments :)

PAX