Marcus Technical Services, Inc.

Where Innovation Meets Excellence

(760) 840-7714

 

I am Stephen Marcus. I have been designing programs since the dawn of the computer age. A lot has changed since then, but fundamentals never do.

My company specializes in C# WPF and Xamarin Cross-Platform Mobile Application Development. We can provide you with:

  • Cutting-edge mobile apps for Android, iPhone, and Windows Phone using the amazing new Xamarin platform
  • Powerful and versatile WPF and WinForms enterprise-level business applications
  • Object-oriented code design using tried-and-tested frameworks
  • WCF Web Services
  • Rapid application development (RAD) using Syncfusion, Infragistics, DevExpress and Telerik
  • Agile software development, MVVM, and design patterns.

When a new client shares their vision for a piece of software, we focus mainly on the concept and design. Once that is complete, the coding is a modest effort. If issues arise with the product, they can be remedied with an improved design. We are not speed coders -- people who work with urgency but without respect for design.

Here are some shared reflections on the importance of this concept:

"Good design is obvious. Great design is transparent."

-- Joe Sparano

"Design is where science and art break even."

-- Robin Mathew

"Design should never say, 'Look at us.' It should always say, 'Look at this'."

-- David Craib

"Design is intelligence made visible."

-- Alina Wheeler

"Our mission is to inspire our clients' confidence by providing thoughtful insight into their issues, creative solutions to their problems, and greatness in every aspect of the products and services we provide to them."

-- Stephen Marcus, President

Why Xamarin?

Most mobile device owners are passionate about their phones. They're either a high-tech Android type or an aesthetic iPhone type. And they agree about very little when it comes to their metallic soul-mates. The camps are likewise conflicted: the Android world touts Java as the ubiquitous, open-source programming language for the new age. Apple leads with breathtaking -- albeit dogmatic -- design.

For the past decade, the only way to develop an app for these two diverse platforms has been to use each company's complex and specialized language(s). It takes thousands of hours to master either of them. Few programmers can claim that they know both at extreme depth. So the programming world has remained divided by both users and programmers. This translates to high costs for app development. And considering that most apps are free, many companies could not justify creating an app for each major phone. Until now.

Xamarin is a new programming platform based on Microsoft's excellent C# language. It includes three separate products for mobile development:

  • Xamarin.Android mirrors Android's Java APIs in C#, so anyone who knows Microsoft's language can create an Android app without re-immersing themselves in Java.
  • Xamarin.iOS does the same thing on the Apple side: using only C#, a programmer can now build an iPhone app. This is a tremendous relief, because Apple's low-level mobile language, Objective-C, is nightmarishly complex.
  • Finally, Xamarin.Forms is a combined approach: the programmer creates the app using WPF-style XAML on the front end and C# code-behind. Xamarin then builds this for iPhone, Android, and even Windows. The final product performs about the same as if it had been written natively.

Because Xamarin is based on C#, most programmers can pick it up and make progress in the short term. C# is a complete, high-level language built on classes and interfaces, so it encourages good programming practices. Apple's Objective-C is quick, but messy to code with. Xamarin compiles C# straight to native code, so the programmer never has to get their hands dirty in Objective-C. The same applies to Java on the Android side.

Xamarin encourages code reuse at every level. It provides Portable Class Libraries whose code can be shared among apps. This reduces long-term effort, especially when developing multiple apps.
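To make the shared-code idea concrete, here is a minimal sketch of a page that could live in such a shared library. This is an illustration only: it assumes a reference to the Xamarin.Forms package, and the class name GreetingPage is invented for the example.

```csharp
// A hypothetical page written once in a Portable Class Library and
// compiled into the iOS, Android, and Windows app projects alike.
// Assumes a reference to the Xamarin.Forms package.
using Xamarin.Forms;

public class GreetingPage : ContentPage
{
    public GreetingPage()
    {
        var button = new Button { Text = "Say Hello" };

        // DisplayAlert renders a native alert dialog on each platform.
        button.Clicked += async (sender, e) =>
            await DisplayAlert("Hello", "One C# code base, three platforms.", "OK");

        Content = new StackLayout
        {
            Padding = 20,
            Children =
            {
                new Label { Text = "Welcome" },
                button
            }
        };
    }
}
```

Each platform project simply presents this one page; Xamarin maps the Label, Button, and StackLayout to the corresponding native controls at run time.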

Xamarin's ambitions were greater than its original resources, so it struggled to maintain a stable, reliable programming environment. But now that Microsoft has acquired the company, it has the finances to support its promises.

Hurry Up (To Go Slow)

I once knew a real estate broker in Vancouver, Canada. The poor fellow was dying of cancer. I remember sitting with him at lunch as he mulled over whether to have dessert. "I may as well just order it", he said. "I don't know how many more desserts I will get."

I began to realize that the reason I spent so much time with him was that he was completely focused on enjoying the time he had left. He had long ago stopped making excuses. He also saw things more clearly than most people. He was much less distracted, oddly enough.

He and I were trying to wrap up a lease for a type-A personality named "JS". This guy was always going a million miles an hour. He talked fast. He walked fast. He drove like a maniac. But he never seemed to get anything done.

One day, in a moment of exasperation, my real estate buddy shrugged his shoulders, pointed at JS and sighed, "Hurry up -- to go slow."

My friend has long since passed. But his observation rings in my ears every day.

We Can No Longer Absorb What We Experience Each Day

Every time I sit down to eat at even the most modest restaurant, I find myself surrounded by television sets. I try to ignore them. If I actually watch one of them, I don't understand what is happening because the sound has been turned off. Instead, atrocious muzak blares from all parts of the room. This is meant to make the place feel "alive". My stomach asks for a quiet meal and I get a marching band along the road to my next stop, ingestion-land.

Like any other fool, I have brought my "smart" (dumb) phone with me. A friend sends me clever texts:

hau

boss is pod

idc fh

g2r

c u l8r


I scratch my head and text back:

I'm not sure what you just said. Can you repeat your thoughts in English?


It takes quite a bit of time to add the proper upper-case letters, apostrophes, and such. I realize that I am spending too much time on it. I miss talking with my friends. It occurs to me that if we had just gotten on the phone we could have had a fun, productive conversation in about half the time that we spent texting. But nobody wants to participate.

r u kidding

dywmtgf


By the end of my meal, after much toil, I realize that I have just been asked, "do you want me to get fired?" {sigh}.

Back in the day, when you couldn't talk for free on the phone, we used to write letters. Sometimes the letters would travel only a hundred miles, and they would take days to get there. But all of that effort made everyone sending or receiving one feel vitally connected to the other person. I still have the letters of my youth. When I read them, I feel like I have not aged a day. Every thought, every emotion, is as fresh as a flower. But the paper trail ends shortly after I bought my first computer. I wonder: where will I store all of these texts that I get these days? Then I think: what's to save? Over time, we'll all forget what we meant by any of it. Besides, my handy phone gets rid of them for me, so I don't have to feel guilty about it.

The same thing happened to photographs. I can't say the word "photograph" any more because nobody even remembers what it was like to get a whiff of developing solution after cracking open a fresh envelope of Polaroid film. I also still have all of my old photographs. They hit me even harder: I can feel the air, the sounds, of every instant. I also stopped taking real photographs some time back. Now I have images in the cloud. But they don't seem as sharp and genuine. A few weeks ago, a friend sent me an Instagram of an amazing dusk scene at Yosemite Falls. I wanted to frame it. But I got tied up, and by the time I tried to look at the image again, it had been erased. It was "just a moment", he said. Maybe we should destroy all of those amazing Renaissance paintings, too. They're just old moments that we don't understand anyway.

We Are So Distracted That We Miss Most of What Happens Around Us

At lunch, in my finest diction, I asked my waiter for:

  • A one-third pound burger made from grass-fed beef.
  • Cooked on the juicy side of medium.
  • No sauce; extra mayo on the side.
  • Extra tomatoes on the side.
  • On sourdough; this is not on the menu, but the chef can use slices of sourdough bread that are normally intended for breakfast.
  • Red onions, very thinly sliced.
  • Half and half of (a) The standard lettuce blend; and (b) organic greens.

As I proudly finished this description, I realized that the waiter was staring at me. This was clearly not the same thing as listening to me.

"You want a burger and what?", he stammered.

The horrid background muzak was just loud enough to justify his deafness. But he was a young guy, with perfect hearing. The fact was, no matter how much racket was bouncing off the walls -- or even if none at all -- my waiter didn't understand my order because he wasn't prepared to hear it. That required careful attention and patience. People don't have much of that to spare any more.

I couldn't get angry; I have found myself doing the same thing at times. It's as if we must first acknowledge that a person really needs our attention, which we resist. Then, we finally give up and focus on them. But they have already said their piece, and we missed it.

It took three attempts to get the order into the waiter's brain. But the chef suffered from all of the same syndromes, so when the burger arrived, it was over-cooked, the onions were in big chunks, and they had forgotten both the extra mayo on the side as well as the tomatoes. So all of that had to be redone.

As I left the restaurant, I realized that the owners of the mall had, in their divine wisdom, piped the obnoxious audio throughout the entire plaza. In this way, no one could say the place was "dead". I was not amused. But most people did not seem to mind. They were much better at tuning it out.

We do this sort of thing every day. Just look at how we read. I grew up reading. I pored over classic science fiction before I was twelve years old. And I do mean: I absorbed every single word of it. Today "reading" means that we look at something that has a bit of text attached to it, which we generally ignore.

I often send detailed emails to my colleagues. I use the same sorts of tricks as I have here: I keep the paragraphs short. I use bullet points. I repeat key thoughts. I ask questions. I make conclusions.

When I get an answer back, the person almost never responds to the entire email. They had only skimmed the thing, perhaps on their "smart" phone (and how smart is that?). Reading takes time, which they don't have. The more they read, the more they have to work. So that's another excuse.

This is hardly what we can call "efficient" communication. Each time I get a partial response back from an email I have to write back and ask again. This eventually rubs someone the wrong way, since they do not imagine themselves to be sloppy, careless or irresponsible.

When was the last time you wrote to some company's tech support department via a web form or via email? No one is even mildly surprised that these communications are ignored. That's the new norm. Of course we have an easy target: the people answering the email or phone are usually foreign workers. They don't understand enough English to thoroughly comprehend and respond to the fine details of a complex issue. So they read off of a tired script that is intended to pat us on our butts and send us on our way -- with nothing resolved. Ironically, in their own language, they are quite fastidious and well-studied -- unlike us.

We Are Too Busy To Make Real Friends

As we get better at tuning out noise -- which includes most other human beings -- we become starved for attention and approval. We try to fix this by increasing the number of contacts we have, and nervously chattering with them. We don't even have to take the time to speak or write English. We have weblish, the new language that has no spelling or pronunciation rules. You know, like when you were two years old.

We communicate with our new "friends" in one-way utterances. To avoid wasting unnecessary time, we tweet our every thought. Now there's a smart company: they force us to say something in 140 characters or less. They have cynically calculated the exact length of our attention span.


In today's high-speed, virtual world, we can say anything about ourselves and it becomes an instant reality, even without proof. The proof is in the utterance. The proof is in how many people listen to us. That feeds the illusion that they believe in us.


It's like drug addiction: when we're high, we have a ton of energy. We can't shut up. But we also can't focus, and we lose our boundaries. So we interrupt, we ignore, we brag, we posture. But we never make a genuine connection with another human being, because that would require us to be completely present. To begin with, we'd have to stop taking drugs.


The drug I'm talking about is not cocaine or heroin. It's our cell phone. It's the only companion we really need. What else is really always by our side? What else can we not live without? What else always makes us feel important -- "I have to take this call" -- ? What else allows us to interact with another person without real personal involvement, without risk, and still call them our "friend"?

One can foresee a future in which we will not need real friends. Instead, we'll have robots that do everything with -- and for -- us. They will never argue with us. They will rely on us completely. Oh. Wait a minute. Isn't that a dog? Well, these new dogs will not make us follow them around with a plastic bag around our left hand. In this new reality, we won't need human beings any more. And nobody else will need us.

This raises the question: if we were stranded on a desert island, and had to decide between having a smart phone with a permanent battery and a clean internet connection *or* a human companion that we had never met (and whose behavior we could not predict), which would we choose?

The problem is: what will this do to the quality of life on planet Earth?

When I was a teenager, I took a 1500-mile bike ride. I was seriously ill the whole way back. My best friend said he would meet me downtown and ride home with me. I was supposed to get into town at 6 p.m. I was four hours late. This was in the days when you had to use a "phone booth" to call another person, so we had no way to reach each other. But that didn't matter to my buddy. He wasn't trying to "do" something or "go" someplace. He wasn't in a rush. He had given his word that he would wait. At ten p.m., in the pitch black night, I pedaled up to him and shook his hand. I can still remember the sheer joy of seeing my best friend, of having him by my side, as I finished that trek. Once you've experienced that reality, anything less feels utterly false.

I won't exchange my world for this new one. I am hiding out.

Hang on. My phone is ringing.

Why I Don't Blog

Initially, we were all told that the Internet was an uncensored public repository of human knowledge. That was a grand idea. But for the youngest citizens of the world, the definition felt a bit stodgy. So they snuck the word free onto the front of every imaginable feature. Free music, free videos, and free news and commentary. The only way to pay for this "free" resource was to allow online companies to tee off on innocent consumers by making them search for everything, then presenting advertising at every corner. And the only way to acquire enough content was to encourage everyone to publish constantly, regardless of value. So every Internet user became an instant expert on any topic that tickled their whimsies. These articles are now called blogs.

Even Bloggers Don't Really Want To Blog

Like any 8-year-old with a new toy truck, today's bloggers quickly tire of their doodling. They also realize that they haven't got much to say. They've been too busy blogging to actually study anything, or to develop a valid skill. So they begin regurgitating each other's spontaneous impressions of any topic that arises. Even that grows tiresome, so their activity becomes spotty. Two-thirds of blogs are inactive after their first year. Not that the blogs themselves go away; the Internet never learned how to clean up after itself. All of this junk just keeps accumulating. Each time you search, 90% or more of the results are so dreary that even their authors have forgotten about them.

Corporations, drooling like wolves at the idiot sheep they encircle, have pumped millions of pages of false content onto the Internet to promote their wares. Most of the reviews on major websites like Amazon and eBay are rigged. Any time you see a website called "Top Ten Review", your stomach reels at the stench of cynicism and greed. That will also never get scrubbed out. If some trusting citizen stumbles by and clicks accidentally once a year, that's money in the bank for a corporation. Besides, it would cost more to clean up their mess than it did to create it in the first place.

Web hosting sites also encourage this malfeasance. The reason you don't see an RSS feed on my site is that I don't blog. The RSS paradigm apparently considers pages as "filler", and blogs as "exciting new content". Nothing could be farther from the truth.

Nobody Remembers Dates, But Everybody Wants To Know What's Happening Now

Every blog is issued on a given date, which is the main index for that content. On websites, you are often asked to choose a date from a calendar, and the blogs for that date are then displayed. This is an example of how stupid the Internet can be, and through what appears to be a logical process. Nobody who escaped Civics class in high school has since spent a single minute of their existence classifying their lives based on the dates when they said or did something. After a certain age, we don't even celebrate our own birthdays. So who cares what date a blog is published, especially considering what tripe it contains?

What the Internet generation wants is the latest, greatest thing. So blogs are dated to let everyone know what's hot and what's not. It doesn't matter what anyone says. They have to say it within the last day for anyone to even listen. With the advent of Twitter, that trend is moving towards whatever is said within the last few minutes. This is also the ultimate testament to the Internet's hapless vapidity; why spend time crafting human language if you can just exclaim something amusing or interesting every hour? For young people, the answer is clear: say whatever is on your mind as soon as you think of it. The more chatter, the more excitement. The combined noise is their sort of "wisdom".

What sort of value system do you get when everything you say is replaced by someone's next instant thought? Human knowledge accumulates from its own painful history. If we trivialize what has passed, we will forget those lessons, and will fall victim to the errors of the generations before us.

How Would You Publish a Classic on the Internet?

Consider these books:

  • Shakespeare -- Combined Works (William Shakespeare, circa 1600)
  • Don Quixote (Miguel de Cervantes, 1605)
  • Moby Dick (Herman Melville, 1851)
  • Leaves of Grass (Walt Whitman, 1855)
  • Huck Finn (Mark Twain, 1885)
  • The Time Machine (H.G. Wells, 1895)
  • Heart of Darkness (Joseph Conrad, 1899)
  • Ulysses (James Joyce, 1922)
  • 1984 (George Orwell, 1949)

Let's play nice and pick the most benign entry here, Huck Finn. It's 1885 in America. Imagine that the Internet started quite early (!!!), for it is already in full bloom. Everybody has a computer. Mark Twain wants to publish a book he wrote about a young boy who likes to get into trouble, but usually learns something as he's trying to con his way out. Everybody cusses profusely, though the practice is frowned upon by formal society. Huck also treats a runaway slave as if he were a normal human being, deserving of respect. This at a time when black people were being treated as farm animals. Mark Twain is a clever man, but even he finds the process daunting:

  • He can't publish it as a book because the Internet has replaced books. The only books being sold are romance novels (so much has changed).
  • He can't just print it on the Internet because it is way too long, and no one will read it.
  • He certainly can break it into a hundred or so blogs. But the work would get jumbled and would not seem tightly related. If someone missed a day, they could lose the entire meaning of the story. The Internet doesn't provide any real means of keeping things in order or in context. People are always confused about what they have read. If something seems familiar, most readers will just skip it.
  • Twain also realizes that because of the Internet's viral self-publicizing nature, the work could bring a negative reaction from the public. The dirty language would cause a sensation. But that isn't why he included it. He just wanted to reflect how people actually spoke. Also, he feels it is inherently funny.
  • Meanwhile, the Internet bloats each day ever fuller of the tired words of dull individuals whom Twain detests. Their success seems boundless.

Of course, Mark Twain went on to become quite famous. But that was because he had publishers who believed in him. Book publishing was expensive; the editors became the guardians at the gate. They had to decide whose voice would be heard. Thankfully, they were generally altruistic. They wanted the world to be a better place. But to be brutally honest, publishing was still an "old boys' club". Women and minorities were often excluded from the process.

One can only wonder what any of history's greatest authors would have thought of today's Internet -- a place where any idiot can publish side-by-side with them, regardless of the quality of the content. Indeed, if the blogger dates their article today, it's news. The Internet props up its claim to vitality by creating new things every second of every day.

The Internet Needs a Garbageman

-- Or maybe just a new one. Google and their cohorts certainly could have kept a lid on this. So much for that. Their bean-counters quickly realized that a bigger Internet meant more opportunities to frustrate users with futile attempts to find things. Each time one of us clicks that search button, ka-ching! Money for advertisers, who often themselves pose as "results". Right. Resulting from bribes paid to Google.

Unfortunately, the only reliable remaining candidates for this delightful job are you and me. For my part, I have:

  1. Refused to become just another "blogger". This site contains no "posts".
  2. Created articles that completely discuss a given topic, which is always clearly stated.
  3. Removed all dates from my site, regardless of the consequences.
  4. Allowed each piece to grow to its necessary length, regardless of word count.
  5. Announced new content to my subscribers; otherwise, I never contact them.
  6. Promoted my writing via LinkedIn and Facebook, but only through passive links.

By Stephen Marcus

Is Your Software Project Killing You?

The Horse That Ran Down A Staircase

I am often brought in to review huge enterprise projects that have fallen completely off track. Not only are these programs over budget, but they don't work as originally conceived. Users have grown frustrated and unproductive. Management wants to know: what went wrong?

Imagine building a skyscraper with a $1 billion (US) budget. If you make a mistake late in the construction, what would it cost to repair? Or put another way, if you had hired an architect for that project, how much would you sue them for? And would that really help? Your client would still fire you, and your reputation would be lost. You would spend years trying to achieve remedies in court -- both as plaintiff (against the architect) and as defendant (sued by your formerly loyal clients). In many ways, the cost would be incalculable. This is why so many companies fail after they make major changes to their operations, including where or how they conduct their business, and certainly including major shifts in what they sell. This includes software development projects, which are some of the most complex tasks ever undertaken by a company.

The tough answer to my clients' question is that they are like a horse down a staircase -- they have committed so completely to what they are doing that they cannot back out (imagine the horse trying that) or change their breakneck downward trajectory.

The most common causes of a failed software project are:

  • The client failed to start with an independent program architect, or hired the wrong individual for that key role.
  • The client couldn't afford one. But could they afford to completely and utterly fail in their development efforts? The answer is always, "Well, now that you mention it..." A great architect pays for him/herself in direct savings that result from proper planning, design and management.
  • The client couldn't find an architect. It is hard to reach out and locate expert help when you need it. The architect has to possess complete knowledge of an extremely complex arena that changes each day. They also must learn the company's business so they can tailor a solution to the firm's needs.
  • The client didn't want to deal with an "outsider". When you open the doors each morning, and watch the familiar faces pour into your building, it's hard to imagine that any person who is not a member of your culture could assist you. But this is the actual benefit: the architect gets to see you for the first time, without bias. They see with clear eyes. Most businesses can leverage this valuable perspective when they are committing major funding to an arena as challenging as software development.
  • The client never interviewed (or qualified) their own staff based on the actual requirements of the software project.

No major human achievement has ever been successfully accomplished without an architect. But virtually every disaster has been blithely uncontaminated by one.

"We've got lots of smart guys in here" is the usual explanation. So when the company decides to move from its ancient web pages to an ultra-modern client-server project, who else could be more suited? Answer: anyone with direct experience in the new technology, who understands programming philosophy, and who obeys fundamentals. When was the last time you evaluated your staff on that basis? "We've been too busy for any of that touchy-feely stuff"...

You owe it to yourself to understand exactly who you have hired and what they are capable of. This not only affects their current assignments. It also indicates whether they can move on to ever more complex and conceptually challenging projects. In many companies, the people working there are the "leftovers" -- individuals who lack the energy to advance their knowledge, but simply wish to pay their bills and keep their jobs. Business owners should not assume that those workers will somehow "rise to the occasion". Have they exceeded expectations previously? Have you had their work reviewed by an architect? Have they shown ambition and drive, or have they generally just agreed with whatever you suggested? Every business needs "a little disobedience" from its brilliant and somewhat untamed staff. If you're not getting it, maybe your employees are asleep.

Insidious Alliances

Modern bureaucracies are based on alliances rather than capabilities. Those who speak up, or exceed the mundane performance of their co-workers, are quickly isolated and expunged, or get frustrated and move on to better opportunities. Unless you have taken steps to protect high-achievers, your company has probably already devolved into this modern-day version of Lord of the Flies. The problem with this environment is that it forms a classic downward spiral: over time, your staff realizes that the only way for them to survive is to protect each other. So they work at the level of the dumbest member, and at the pace of the slowest member. Even managers get sucked in: they build their own alliances by approving the incompetent work of their subordinates.

Part of investigating a "software nightmare" is to ask the programmers what they feel is wrong, and how to recover the project. The single most alarming fact that arises from these inquiries is that everyone can eagerly point the finger at someone else. Also, the software development process sounds eerily similar to the way one thinks of an old building: It's always existed; it can't be torn down; the only thing anyone can do is patch the old thing up and pray it does not collapse around them.

After this initial muddle, I talk with the managers officially assigned to the undertaking. They complain that upper management demands a finished product without an adequate budget or a fair schedule.

Finally, I analyze the facts, and eliminate the opinions. In virtually every failed software project, this is the picture drawn by the cold, hard truth:

  • The project lacked a clear development plan, including a reasonable budget and valid timeline.
  • Management failed to hire a qualified architect.
  • The program evolved spontaneously, using miscellaneous effort from mostly unqualified programmers.
  • Because of changing requirements, the project stalled and so never achieved its original (vaguely stated) goals. Meanwhile, the budget bloated out of control.
  • Once the downward spiral had begun, no one in the company had the will to stop it. The topic of actually redesigning the software was so politically incorrect that no one had the courage to raise it.
  • Anyone attempting to change the course of the project got labeled as a "malcontent".
  • Expectations plummeted.
  • The company cut and eventually canceled the budget.
  • The responsible personnel -- especially managers who oversaw the entire process -- were never punished. Instead, they received new assignments that eerily resembled the ones that had failed.
  • Their accounting department wasn't of much help.

    Every major corporation has an accounting department, and you would certainly think they could calculate the cost vs. benefit of the things they purchased. Yet they sometimes behave as if they don't even know what a calculator is. Here are some juicy gems that I have witnessed personally over the past ten years:
    • They approve the hiring of over a hundred programmers, but when asked to provide essential tools to help those individuals perform more effectively, they respond that such expenses are "unaffordable". Sometimes the tools are even free; this does not seem to matter. The entire topic of acquiring tools is taboo. If you take the average programmer's gross cost of around $120k per year, and figure that they could increase their accuracy, efficiency and effectiveness by at least 25% with proper hardware and software, it is easy to calculate the break-even: $30k per year. But the budget for new tools is nearly zero. Telling an accountant that struggling, inefficient programmers do sub-standard work that damages the company's long-term objectives is like talking about next month's weather: it is uncertain, broad-sounding, and resides in the nebulous future, so it cannot be calculated with their methodologies.
    • They authorize contracts that require programmers to work long hours, including unpaid overtime, because that fits their budgets. But one day the Labor Board finds out and shuts the company down because of illegal practices. The company is then sued on behalf of the workers. The accounting department blames this on the legal department, but the entire company signed off on mistreating their workers until it bit them on the proverbial posterior.
    • When a temporary programmer achieves greatness in his tasks, he is sometimes offered a salaried position. But the offer is inferior to industry standards, so the candidate refuses and moves on to work elsewhere. The company loses a valuable asset because it does not know how to quantify one person except as "one potato divided into all of the mashed potatoes we make each year". This is another way in which the staff at major U.S. corporations devolves into incompetence over time. The workers that remain are the sheep that have learned to keep their mouths shut (except to chew their cuds).
    • If one software project requires planning and design, the accounting department chimes in, "Is this liberal arts stuff really necessary?" because no measurable change has taken place. Meanwhile, the same group cheerily authorizes any development assignment where no planning takes place and the team instead charges blindly forward, announcing its progress in daily scrums and piling thousands of lines into the code base. Sure, the spontaneous project will probably fail, but that event will occur in the future. It won't affect this year's budget.
  • They failed to heed the warning signs of a project gone bad.
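The tooling break-even in the accounting examples above is simple arithmetic. A minimal sketch, using the rough figures assumed in the text (a $120k gross cost per programmer and a 25% effectiveness gain):

```python
# Back-of-envelope break-even for programmer tooling. The figures are
# the rough assumptions from the text: ~$120k gross cost per programmer
# per year, and at least a 25% gain in effectiveness from proper tools.

def tooling_break_even(gross_cost_per_year: float, effectiveness_gain: float) -> float:
    """Yearly tool spend per programmer that pays for itself."""
    return gross_cost_per_year * effectiveness_gain

per_programmer = tooling_break_even(120_000, 0.25)
print(f"Break-even tool budget: ${per_programmer:,.0f} per programmer per year")
# → Break-even tool budget: $30,000 per programmer per year
# With a staff of 100+, even a fraction of that budget dwarfs the
# near-zero tool spending these departments actually approve.
```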

At the risk of sounding like a late-night talk show host, you know your software project is going terribly wrong when:

  1. You are having trouble talking about what you are doing because you assume "it's too complicated for normal humans". Complexity is not a goal of software design. Simplicity is. If someone can't describe an important aspect of your project to you, it is usually because they are testing your gullibility. Once you "pass" this test, you are doomed to receive a barrage of cryptic explanations of why nothing works and cannot be done on time or within budget.
  2. The program has not been completely described by requirements, or those requirements have begun to seem vague, as if they were cut-and-pasted from online samples. Your planners should have laid this out in extraordinary detail.
  3. You receive advice to adopt special management techniques like scrums where your workers get treated like military cadets. The goal is to "keep the boys in line", but all this ever engenders is resentment. An efficient programming staff does not have time for unnecessary meetings, and should spend zero percent of its time trying to make management "feel better".
  4. You realize that the planned approach to developing your software failed to remain open to obvious changes in technology and process within the industry. This means that your architect was an "old school" type who figured, "This is how we did it at IBM."
  5. Your customers are not directly involved in the software development process from the start. These are the individuals who will have to buy and make use of the product. If they are absent, so is most of the information you need to make them happy.
  6. You have three or more managers in charge of the project, who spend their time in endless emails and meetings trying to demonstrate why they are essential to the process. This creates an irritating "noise" for those who are trying to produce the actual finished product. It also indicates an obsessive control disorder that many modern bureaucracies suffer from. The happiest and most energetic companies I have seen don't actually use management the way corporations do. The owner is usually present, with an assistant. Anyone can talk to "the big chief", but they also answer to them. Modern management creates a competition to identify the most obnoxious and demanding individual; the lucky winner gets to run the place.
  7. Your development teams are large, preventing them from acting efficiently. This also encourages managers to treat them as a mass group, without individual voices. Your programmers are the only reason you exist as a software development entity. They have a lot to say about how to become more efficient and effective, if anyone would listen.
  8. Your staff fails to answer emails or other forms of communication, or becomes vague in discussing the project within the team. Senior members become arrogant and distant because they feel indispensable to the outcome. You begin to feel that they are actually running your company out from underneath you (they are).
  9. The staff has to work overtime just to keep up. This usually means that the project is improperly planned and designed, and/or that the workers don't know what they are doing. A proper use of overtime would be to sprint to the finish of a quarterly deadline.
  10. New programmers take 3 months to become competent at the assignment and 6 months to become valuable. This reflects a cumbersome, unmanageable code base. If I had the choice, and some devilish attorney could make it legal, I would give every new programmer a polygraph test 3 months after they were hired. They would have to answer honestly, or be fired. The answers would, however, be thrown into a hat so we would not know who said what. The questions:

    • Is this software competently designed?
    • Should the entire project be scrubbed and started over again?
    • Who is valuable on the software team, and who should be fired?
    • What can we do to make this program great?
    • What can we do to help you maximize your potential?
    • If you received a job offer from another firm, would you take it?

    I would give these interviews 10x the weight of anyone else's opinion on the staff, including my own. If the new programmers are properly selected, their impressions are priceless, if the company listens to them.

  11. The code base is improperly documented. Even as an amateur, you can judge code comments for yourself (I'll write about that soon), but maintaining them is your architect's job. The inability to describe what you are doing and why you are doing it reflects a lack of conceptual skills, which are more important than rote technical knowledge.
  12. When bugs arise in the program, they get recorded and fixed like isolated issues that will not take much time. In a professional environment, a bug is a "kind of problem" that is symptomatic in the location where it is found, but almost never isolated there. Virtually all bugs are design problems, and so must be addressed at that level. This is why your programmers should possess a 50/50 blend of conceptual and technical skills.
  13. The coders rush their work. Their goal is to achieve a result through symptomatic testing. The proper approach is to plan and design the requirements, the project, and the code as essential pre-steps to actual coding work. The faster a coder works, the sloppier they work. Yet when I ask managers to show me their "best coders", I always get introduced to someone floating on a jagged cloud of crystal meth. The calm, quiet, deliberate workers get labeled as dullards, though they do the best overall work.
  14. The programmers talk about "refactoring" the code "when they find time". This is the worst single statement any programmer can ever utter, as it indicates that they did not design their code properly in the first place, or they are covering up for someone else's incompetence.
  15. The project requires complicated steps to produce and release for testing. This is the most frequently underestimated area of development because so much goes wrong so often that the staff gets numb to the problems and acts as if "we'll handle that on release". Yep, by your very angry clients.
  16. The customer must traverse a Mount Everest-sized learning curve to adapt, convert, and use your program. They are too busy to begin such adventures. Besides, like all human processes, it involves variation and complexity that sucks up time like the desert sucks up moisture.

Management gets isolated at the top

"Uneasy lies the head that wears a crown."

-- Shakespeare's Henry IV, Part II, 1597

Most people fantasize about what it would be like to own a business. It's like wanting to own a yacht. Yet there is that other saying: "The happiest two days in a sailor's life are: (1) the day they buy their boat; and (2) the day they sell their boat."

Running a business in the long-term is also like being an Indian King: "My slave is my master". So I have a lot of sympathy for the founders of companies, as they want what everyone else does, but they have the courage and patience to stick with it, even when it costs them the most productive years of their lives.

Business owners (or top executives) also suffer from some interesting maladies that the rest of us can't relate to:

  • They want to achieve their vision so badly that they sometimes lose their vision -- that is to say, their judgment. They can be capricious and reactive. This also occurs when you are emotionally exhausted after many years of struggling with the physical realities of running a company.
  • They become so desperate to unload their burdens that they over-trust the help that they do hire, and so get taken advantage of. They hire the managers that hire the managers that hire the managers that oversee programming projects that can scuttle the entire business. In their attempts to delegate, they often lose control of very dangerous situations involving huge amounts of money, as well as their personal reputation.

By Stephen Marcus

Is the Internet driving us to distraction?

My father died before the computer age really began. I was a young man during that bleak winter in 1982. Imagine my guffaws if you had told me then that within 15 years, everyone would own a personal computer vastly superior to the one used for the moon landing, and that within 30 years, the hunky beast would shrink into a tiny device called a "smart phone". Or that we would spend all of our time interacting with something called the Internet, which no one understands but no one can live without.

As Paul Harvey might have said, what actually happened is "the rest of the story".

Does anyone really remember the rest of the story? It comes to me in flashes:

  • When I was in college, the only computer I remember is one that took up an entire building. It counted attendance and determined who got into what class. The monster was so massive that it required special air conditioning and constant maintenance.
  • About that time, somebody said they were doing some reading about an electronic publication system where people could send articles and maybe even get paid for doing that. I wasn't sure what they were talking about. It sounded like just another scam.
  • My brother graduated with a Master's in math and computer science, and landed a great job at a prestigious firm where they created IBM mainframe "software". It was so complicated that we could never really discuss it.
  • After my father's death, I bought my first "computer". It was expensive, and difficult to use, but wow, what a device. It made a calculator seem like a toy.
  • My knowledge of computers and software grew rapidly from there. I wrote visual basic scripts to change the way Microsoft Word and Excel behaved. I designed and coded a program in Pascal that eventually ran my entire wholesale business.
  • In the 1990's, I heard about a new communication tool where we could write electronic messages to each other -- "email". This was also when I first heard about the rush to claim domain names so you could have a public "website". The technology was so new that it required specialists to put it all together. But it was fascinating. There was a real sense of acceleration.
  • In 1997, the Thai government unexpectedly devalued its currency, the baht. That helped touch off a worldwide sell-off; by the spring of 2000, $3 trillion had evaporated out of the stock market, and the blazing-hot NASDAQ, with all of its high-tech startups, collapsed like a dying star. I was standing a bit too close to the flames; I lost my business and fell flat on my face.
  • My interest in programming had grown so much that I changed my profession and became a programmer full-time. Cell phones were still clunky and expensive, but we loved them. We started thinking that the future would be like Buck Rogers and the science fiction we had grown up on.
  • Everything since then has been a blur. The world is moving so fast that we no longer record memories the way we used to. So here we are.

Attention Deficit Disorder Is The New Normal

Where are we? For one thing, we are riding the wave of an explosive information revolution. That sounds both threatening and exciting. Until you do a search on the Internet and get hustled and lied to by people you will never meet (thankfully).

Everything I get off the Internet is chronically broken and over-hyped. Even when it's free, it's a bad value, because it takes too much time to manage. That's because the only way for companies to get attention is to offer something for free. They can't spend any real energy on it, so the thing they produce is more Internet junk. They also don't have the time or money to support it. So they sell it through their clenched teeth and hope for the best.

So much information is now being published that no person could ever consume it. But why would they want to? To purge their stomachs? Look at the advertising for Search Engine Optimization ("SEO") services. These companies claim that they can get your website ranked high enough to attract real traffic and maybe even hustle a few bucks for yourself. But how? Oh -- they just create a bunch of dummy content packed with keywords, and tie all of that in to your website. Harmless stuff, right? Unless you use the Internet. Talk about putting a fire out with gasoline!

We have grown so accustomed to the assault on our senses by television, radio, and now the Internet, that we no longer feel it is worth our time to listen to any one thing with any priority or patience. It's all garbage to our brains at this point, so the only way to cope is to give less and less energy to any one thing at a time. Doctors used to refer to this occasional malady as Attention Deficit Disorder ("ADD"). Now it's just being alive in the 21st century.

Welcome to the Something-for-Nothing Generation

So what does the future hold for a society driven by its own impatience to get more and more out of less and less focus?

  • My father was a carpenter. I learned the trade from him, and practiced it for a decade. Who would do that today? They would want to watch a YouTube video, and go out and buy a power saw. Let's just say that the bloody fingers will be a-flyin'. And forget about getting your door fixed.
  • Are we really getting any smarter? The only kinds of material that gain traction with the public any more are viral videos, gossip, and something-for-nothing hustles. Web developers have chopped their content into bite-sized marshmallows. No one has time to study anything. So where will our real intelligence come from?
  • I bought dinner for two young women recently. One was older and quite gracious and respectful. The other was a young brat who could not stop pecking at her cell phone. Eventually I grew so irritated that I asked who she was communicating with. She claimed it was her "best friend". The problem was, they had never met.
  • How will the next generation handle real inter-personal relationships? It's great that you can exchange live texts with someone from Thailand. Have you read the Economist lately, to understand that country's political turmoil? I am guessing that if your new play-mate brought up any of those unpleasant topics, you would have to "unfriend" them for making you feel guilty and thoughtless at the same time. Owie!
  • I grew up middle class. Everything in my life came directly from my parents' efforts. That was the core ethic: we reap what we sow. Who is producing any value any more? And what will they receive in return?

By Stephen Marcus

Welcome to the "Google Net"

Don't worry, mom. I'm just playin' with the Internet!

When I first jumped onto the Internet in the mid-1990's, I used Yahoo as my search engine.  If you had asked me then where Yahoo would be in 20 years, I would have said, "the top of the world".  But the Internet is complex and therefore prone to unexpected twists and turns.  Yahoo barely exists.  They are best known for demanding that their lazy staff show up to work on site, indicating a complete loss of control over how things get done there. Meanwhile, a cock-sure late-comer with a funny name -- "Google" -- is indeed at the top of the world.  I would have bet against that, considering the competition.  Google's rise describes the Internet's own transition from an innocent, friendly puppy to a cynical, conniving boa-constrictor that squeezes us a little tighter each day.

Is There Some Reason That We Have To Search for Everything?

Dave, you're not searching enough! You don't want to jeopardize the mission, do you? Daisy. Daisy, give me your answer do...

The Internet began by allowing us to look for things.  But it's never grown up.  Sure, it was cool to dig around for a few hours and find some unusual content on the "net".  But that's only because we had no other choices. Imagine getting up in the morning and searching for your clothes.  Wouldn't that get irritating?  You would expect to have them organized in some way that made them easy to find just based on the type of occasion (business, casual), and perhaps even by color and/or style.  That can be done inside one physical dressing room.  So searching for your clothes is not a service to you; it's an insult and a waste of your time.  The same rule should now apply for the 20-year-old Internet.  This kid's grown up, but still behaves like a messy adolescent. The information stored on the Internet is so confusing that everything must be searched for as if it were lost.  This is supposed to be taken as the natural state of data.  It's not.  So how did it get so "lost"?  And why?

Why can't we ignore search results that we've already visited?

Let's say you go onto Google and search for turtles.  You just love the little crawlers.  You open dozens of links and read all about them. A few days later, you want to continue your research.  But surprise, surprise: the same basic results appear.  The links you visited are marked in a purple color, but they're still  listed as a result.  You look through the Help system to find some way to search without getting results that you've already read.  But no filter is available to remove visited links.  Why would they omit something so obvious?

Guess who's coming to dinner?

Let's defer to an episode of Rod Serling's ground-breaking sci-fi show, The Twilight Zone, called "To Serve Man".  Aliens come from outer space, claiming that they just want to "serve man".  At first they are greeted with fear and skepticism, but the slogan wins mankind over.  We agree to visit their planet.  The space-ship is loaded with eager earthlings.  The passengers wave to their families, smiling, looking forward to their new lives with the aliens.  Then suddenly a reporter bursts into the airport.  He has translated the alien document entitled "To Serve Man".  It's a cook-book.  Everyone shrieks.  But the space-ship cannot be stopped.  It takes off, fully stocked with fresh meals for the flight back.

This may sound grim, but so is capitalism.  Google and their pals are not here to serve you.  They're here to eat you.  But first they must ask your permission.  That way, it doesn't seem so rude.  Every experience you have on the Google Net is tarnished by this amoral philosophy. The answers to the questions posed here are:

  1. The reason that everything is lost on the Internet is that this causes human beings to search endlessly for things that could easily be organized and found without much effort.  Each time a user searches, Google sticks them like a mosquito and sucks a little blood (in the form of irritating advertising).
  2. The reason that searches do not exclude visited links is that this causes you to mistakenly repeat yourself, which means more time on web pages.  More mosquito bites.

So Why Blame Success?

It's a completely fair game. Except I own everything, and I've got this pitch-fork...

Success is great! As long as individuals and corporations play fair.  That means honoring the spirit and intention of our capitalist system: No company shall create a monopoly within an industry.  This always results in poor service, inferior products, and higher prices.

When was the last time anyone said that Microsoft makes "great" operating systems?  Or that they really love the eBay site -- which hasn't improved in at least ten years?  These monopolies don't exist because of excellence. They achieved their dominance through shrewd trickery, and now maintain it because their would-be competitors are long-buried.  Back in the 1990's, Microsoft bribed PC manufacturers to install Windows as the only operating system on their new machines.  Sure, Microsoft got sued, and they paid a nominal fine.  But the damage was done.   That's why the practice is illegal.  It destroys capitalism and harms consumers.

The Robber Barons Are Dead!

Remember to watch your manners. Always ask: "Do you want to give us your money, or get shot in the face?"

America was built by greedy capitalists. They were known for their ruthlessness. Each managed to dominate an industry in a way that violated U.S. anti-monopoly laws, but could be achieved through bribery and coercion. They were criminally cheap as well; their workers were underpaid and mistreated, and child labor was common-place.

Name                  Wealth In Today's Dollars  Monopolized Industries  Era
John D. Rockefeller   $336 billion               Oil                     Late 1800's
Andrew Carnegie       $309 billion               Railroads; Steel        Late 1800's
Cornelius Vanderbilt  $185 billion               Railroads               Mid 1800's
John Jacob Astor      $110 billion               Real Estate; Fur        Early 1800's
Jay Gould             $71 billion                Railroads; Gold         Mid 1800's

These "robber barons" took huge risks, which paid off in the short-term. But their manipulations often caused crises for the economy in general. The Panic of 1873 was triggered by the collapse of Jay Cooke & Company, the bank that had financed the Northern Pacific Railroad through fraudulent (and worthless) bonds. Ironically, that panic bankrupted many of the robber baron "brat pack". They could not see how their actions could turn into disaster. Some say that the robber barons' meddling helped set the stage for the Great Depression.

The public got fed up with the top 1% of the population owning much of the country's wealth. There was indeed an income tax law dating back fifty years. But it didn't have much bite for the rich.  In the early 1900's, Congress finally approved changes that were intended to redistribute America's wealth to all of its citizens. That was a bit grandiose, considering that the super-wealthy were a lot smarter -- and much more aggressive -- than the average person. All they needed was a way to keep their wealth while appearing personally to have only normal assets.

Long Live The Robber Corporations!

What -- is it the tie? The coffee cup -- ?

Obscene personal wealth had become embarrassing -- and highly taxed. But corporate taxes were low, and corporations could hold infinite wealth.  So began a movement that has forever changed the landscape of American wealth. Here are some of America's "new" robber barons -- err, corporations:

Corporate Name  Market Cap (Value of All Stock)  Monopolized Industries
Google          $400 billion                     90% of search engine traffic
Microsoft       $355 billion                     85% of personal computer operating systems worldwide
Facebook        $188 billion                     More than half of all personal social networking
Amazon          $144 billion                     Completely dominant in online retail sales; three times the size of its next online competitor, Apple
eBay            $65 billion                      Virtual lock on online auctions

These companies have done the same basic thing: they have formed illegal monopolies by -- according to them -- becoming too good at what they do.  Miraculously, their competitors have uniformly failed at this!

But the members of this list have something else in common: they're not really worth anything.  Sure, they have patents that their attorneys declare to be worth billions of dollars.  But what else is there, really? A bunch of desks in offices? People who work there, but could easily go elsewhere? They sell a lot, sure, but any of these firms could be replaced within a decade without so much as a whimper. Impossible, you say?  Does anyone remember MySpace? It was founded in 2003!

Google epitomizes the birth of the Virtual Corporation:

  • It has ideas, but those change so often that it is impossible to determine their value. Importantly, the ideas can easily be replaced by new ideas from an unexpected competitor.  This is called a "Black Swan" event (per Nassim Taleb).
  • It doesn't really perform a service, and doesn't genuinely offer a product. Everything it sells is "virtual" in that it appeared instantly and can disappear just as fast. Microsoft is the only exception here, with some hardware in its portfolio. But its recent operating systems, Windows 8x, have failed to impress consumers. Its so-called market share is largely made up of users in third-world countries who are still sucking the vapors out of Windows XP, probably without even paying for it.
  • The "market cap" (total stock value) is ridiculously high in comparison to normal companies. Amazon is the worst offender, often showing corporate losses. Strangely, that makes the market-cap-to-net-profit ratio meaningless, since the denominator would be a negative number! Again, innumeracy wins the day. Just remember: a bubble is always followed by a burst. So these companies are doomed to failure by their own excesses.
  • It can launch, rise to stardom, and fail completely within ten years.
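The innumeracy point above can be made concrete. A minimal sketch (the dollar figures are hypothetical, chosen only for illustration) of why a market-cap-to-profit multiple says nothing once a company posts losses:

```python
# A price-style multiple (market cap / net profit) only makes sense when
# profit is positive; a loss year yields a negative "ratio" that means
# nothing. The figures below are hypothetical, for illustration only.

def cap_to_profit_ratio(market_cap: float, net_profit: float):
    """Return market cap / net profit, or None when the ratio is meaningless."""
    if net_profit <= 0:
        return None  # negative or zero earnings: no meaningful multiple
    return market_cap / net_profit

print(cap_to_profit_ratio(144e9, 2e9))     # a profitable year: 72.0
print(cap_to_profit_ratio(144e9, -0.25e9)) # a loss year: None
```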

How To Fix The Internet (And Revoke the Google-Net)

I've given Google the hardest time because it has more real influence over our lives than any of the other companies listed. It's huge -- and growing. If we change Google, everything else should fall into place. The steps are:

Protect Our Privacy

Now if I can just find 17 janitors to help me shut the danged door...

  • Whenever I search for something on Google, and then go and do something else, I immediately begin noticing tiny ads for the thing I was looking at previously. I am sure Google thinks this is helpful, but what it really represents is an inexcusable arrogance and indifference about my privacy. Most of the "deals" I am offered are actually over-priced.  Somebody has to pay for the advertising that I didn't ask for -- so I guess I am elected! This has to stop.
  • Congress should pass a bill that no search engine can produce advertising or false search results without the user's explicit permission, and that this permission shall always be denied by default. Also, it should be affordable to shut off all advertising permanently, perhaps through an annual fee not to exceed $50. That's four bucks a month for your sanity back. How much was this month's cable bill? The Internet is a hundred times more useful. We should pay for a clean Internet and then demand it from Google.
  • All Internet browsers (and this technology may change) should operate only in cloaked mode, in which nothing about the user is revealed. All search history should be encrypted and private and easily deleted. This should not be optional.
  • No web page should be allowed to pop up any new window without the user's explicit permission, which should be denied by default. No exceptions!
  • Any web page that offends the user or invades their privacy should be easily marked and reported by the user. This should be investigated and cured immediately, and must be Google's direct responsibility. They should be legally liable for damages to users if they fail to enforce the new privacy rules.
Make It Easy To Find Things

Doohhh... I KNEW I should have alphabetized these books!

  • The big change here is that your world needs to be organized based on who you are and what others like you are doing. If it's afternoon and you're driving your kid back to school, and you happen to pass within a short distance of your dry cleaner, and there are clothes waiting for you, your computer should tell you so, and guide you right to the door -- even if it's hidden inside a shopping complex. The things you do tell a lot about you. As you go through your life the computer should learn about your needs and wants.
  • What if you are a thief, trying to take advantage of this sort of profiling to find victims? You must first be identified firmly and beyond doubt. There are a number of solutions for this, and they are slightly inconvenient. But they should only need to be done once per person. Once completed, any personal information about you -- especially your name, social security number, driver's license, home address, etc. -- will be permanently erased from the Internet's servers. The goal is to know all about you, but only in ways that don't clearly identify you or allow the system to be used against you.
  • Once identified, you will have a number issued by an independent agency that everyone, including Google, will use for you. They will never again know even the simplest things about you. But they will be invited to learn about what you do and like, and how you spend your time. You should be unafraid, as extreme measures have been taken to protect your privacy. The more you share your life with your computer, the better that device will be at helping you. Computers should build shopping lists for you, and then route you from place to place, at any time of day or night, based on the best prices for the goods you buy. You should never over-pay for anything again.
  • Once your profile matures, Google should be able to identify your peer group. This is anyone anywhere who shares certain key beliefs, attitudes, or habits with you. Their personal favorite foods, etc. should be offered to you when you are hungry. It doesn't mean you have to obey this guidance. It's just that the odds are very high that you will not only enjoy the experience but find something new as well. Your invisible, unknown peer group will help you to grow and expand your horizons. Similarly, you will affect them with your own passionate opinions and actions.
  • In ten or twenty years, the concept of searching for something should feel like a complete waste of time.
Dump The Idiot Web Browser
What's Wrong

This web browser is acting funny...

  • At the computer age's Big Bang, the web browser was provided for free so users could have a peek at the wonders of the high-tech world. New versions of the browser arose, but none was significantly different from the other. They were toys. Computer users had grown accustomed to running desktop applications, which were much faster, more customizable, safer, and more powerful. Then a funny thing happened.  Amateurs began developing web pages that relied on the browser. JavaScript was released, allowing these hacks to throw together pages with animated features. Web development became a big business, and anyone could do it. And it was fast. So the browser became the only way to work on the web. Unfortunately, the "user experience" got lost in the shuffle. As long as people could eventually stumble onto what they needed, and the tools were free, nobody cared. That was a big mistake.
  • The web browser was intended to be cross-platform compatible across the Apple, Windows, and Linux operating systems. That is disingenuous. Indeed, each of the big three OS's now has all of the major web browsers. But the browser itself is not in any way compatible with other operating systems. Just try installing Internet Explorer for Mac on an IBM PC. The actual application is hard-coded for the operating system. So each browser company must release a specialized version for each of the Big Three OS's. That's a plain fact.
  • The other part of "cross-platform compatibility" is consistency of user experience. This is also a fallacy. Users are free to run any version of Mozilla Firefox they wish, including one that is five years old and that barely functions under Windows 8. Users can turn off JavaScript, which is so prevalent nowadays that removing it hoses most web pages. There are hundreds of variations of browsers based on their version and internal settings. The browser is, without exaggeration, the least compatible piece of software ever released.
  • The web browser is also the least secure piece of software ever released. There are so many ways for hackers to trick the browser that the security suites can't keep up. The web is a very dangerous place to be, and that is due to the browser itself. It is only going to get worse.
  • The Internet is full of junk that is not helpful to the consumer, and is generally used to manipulate search engine rankings. There is no coordination to centralize content and get rid of redundancy. There is no real clean-up system for junk in general.
  • The Back Button. Imagine throwing a huge party for your daughter's Quinceañera (15th birthday). All of your family and friends are there. Everyone wants to toast this and toast that. You drink way too much. Some old feelings bubble up, and you begin insulting people. You and your best friend get into a fist-fight. Others join in. You end up in the parking lot, with everyone screaming, covered in blood and urine, getting your face bashed into the filthy gravel. Your heart stops; you have a stroke. But miraculously, you wake up in the hospital, heavily medicated but otherwise fully alive. You regret what happened. You want to return and make things better. So you hit The Back Button. Bonus quiz: where do you end up? At the party or in the parking lot? You know the answer because every day, at least once, you fill out some moronic web form and accidentally click The Back Button. Whoosh! Everything is gone and cannot be recovered.
  • The Internet and its awful web browsers exist in what is called a stateless environment. That's because the original server technology behind the web has not significantly changed in 20 years. The server just passes requests back and forth. It doesn't have any idea of how to store what you're doing, and doesn't try. Software developers have struggled to find a solution, but have never challenged the nature of the server itself: a big, strong, fast work-horse with no brain. That can change, but only through public pressure. Most of the pain you feel in your web experience is due to this antiquated infrastructure of stale ideas.
  • It's way too hard for users to actually use the Internet. How many people reading this article are comfortable with rewriting their web page whenever they want to post a few revisions? Almost nobody. Maybe your teenage son could do it, but he ends up playing video games instead. The pseudo-expertise required to create web pages is an example of stupid technology: it's just hard enough that most users can't figure it out, but it's not made up of anything robust, reliable, scalable, or manageable. So web developers make a killing "maintaining" web sites that the users cannot manage themselves. As a result, false middle-men like Facebook have arisen to "fill the void". But they accomplish that by forcing you to live by their rules, and sucking the life out of you with constant invasions of your privacy.
What Must Be Done About It

If I could speak, I would say some things that would make me want to stop speaking!

  • Since all browsers are basically desktop applications, with a unique version for each OS, there is no reason the developers cannot just use a real desktop application for this purpose. It would be vastly more powerful than a browser. But they would have to give up HTML, JavaScript and other toy-boat scripting languages. That's not much of a loss. These languages encourage bad programming practices anyway. The new browser will be completely stateful, remembering everything you do and losing nothing. It will be customizable so you can make it look and behave any way you want. It won't have -- or need -- a "Back" button. It will provide powerful organization for you and your peer groups. It will interface with your car, your phone and even your home.
  • Web servers must be reinvented to provide hosting for Unified Documents rather than web pages. An open-source, world-wide standard must be created for what a Unified Document is. No major corporation will control this definition. The new type of document will support text, tweets, emails, sound, video, and even databases. Users can then publish any document to the "Internet" that they wish. The editors for creating these new documents shall be extremely easy to use. Scripting languages will be constrained so they are not essential to how the documents are fetched or presented. Documents will be publishable for any number of persons specifically or for the public in general. You should be able to get your "new" Unified Document published on the Internet with a single click of a button.
  • The Internet should be built to understand content so clearly that it knows how to prevent duplicate content -- including parts of documents, which might contain parts of other documents. There should only be one copy of anything -- ever -- on the Internet. All other references should be "linked" so they refer back to the single source of truth for that document.
  • Users should be able to instantly notify the Internet about "bad content" with the click of a button -- advertising, computer-generated nonsense, etc. Since users are certified, they can't illegally vote. Their opinion should drive the Internet's own clean-up system to remove any form of malicious content from the web.
  • Advertising will virtually disappear. Indeed, if a user pays a $50 annual fee, advertising will be illegal. If a user wants to buy something, they will consult their peer group to find out what real person has bought the item before, and at what price. All goods will be offered with explicit and complete details, so they can be compared for price vs. value. No person shall ever offer anything on the Internet who has not been identified per the rules of the web. So no more scams.
  • Email will be eliminated. Instead, all communication will be between known individuals and with complete permission of the recipient. The communications themselves will be Unified Documents, as described above.
  • Violators of Internet etiquette will be quickly cut out of the system and their servers banned. Fines and prison terms will be provided for chronic abusers.

By Stephen Marcus

What Is a Program?

I think ... it's hungry!

We all rely on technology for each instant of our lives. But we are also threatened by this new and necessary magic because we don't really understand it. Most of the devices we use are either broken, mis-configured, or just badly designed. But we can't do anything about that, either, because we do not participate in the process that creates them. We can get out of that trap by learning more about these "toys". Let's begin with the computer itself:

"Toaster"... hmm... does that mean I can dry my fingernails, too? It couldn't hurt to try!

A computer is as dumb as a toaster. In fact, an old-style toaster is much smarter than a computer will ever be. A toaster could tell if a piece of bread was cooked because at a certain temperature, a sliver of highly sensitive metal would curl and touch another piece of metal, causing electricity to flow. That fired the eject mechanism, and POP, out flew the toast, hopefully onto your plate! A computer could never do that, at least not without a lot of help. All a computer can do at the hardware level is tell the difference between 0 and 1. If you look into your back yard, you can see a large version of the hardware that exists inside the central processor of any computer: a simple gate. If it is open, the computer reads 0, or false. If the gate is closed, the computer sees this as 1, or true. The problem is that there are actually three decisions that should be made if the computer is to rely on the gate as an analogy of anything and everything:

  • The gate is open because it has been opened.
  • The gate is closed because it has been closed.
  • The gate is unset because no one and nothing has decided to either open or close it.

So can a computer determine this at a hardware level? No! It is a binary system, so it must assume that when a gate is open, it was left that way on purpose. This does not reflect reality in any way. But it's where we began, and nothing has been done about it since. So a computer is defective even by the simple standard that we have cited here.
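For what it's worth, modern languages can simulate the missing third state in software. Here is a small sketch in Java (the class and method names are my own invention), using the boxed Boolean type, which has exactly the three states described above: true, false, and "nobody has decided yet":

```java
public class GateDemo {
    // Java's boxed Boolean has three states: Boolean.TRUE (closed),
    // Boolean.FALSE (open), and null (unset -- no one has decided).
    // A primitive boolean, like the hardware gate, is forced to pick
    // one of only two values.
    static String describe(Boolean gate) {
        if (gate == null) return "unset: no one has opened or closed it";
        return gate ? "closed on purpose" : "open on purpose";
    }

    public static void main(String[] args) {
        System.out.println(describe(null));
        System.out.println(describe(true));
        System.out.println(describe(false));
    }
}
```

The third state must be handled explicitly in code; the hardware itself still only knows 0 and 1.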

So how does a computer do anything? The answer is: software. Some kinds of software help the computer be a better physical device. The software that you encounter the most comes in the form of software programs, written to allow you to interact with a computer, and to let the computer pretend to be smart. If it sounds complicated, it is. But as with anything, if you learn the fundamentals, you can build on them over time. So let's begin by defining what a software program is:

A program is an automation process in which highly organized variables interact through their relationships with each other and through events.

 

There's also one key addendum to help set us on the right course when creating a software program:

The program shall be small.

 

Philosophical statements can be hard to digest. Take, for instance, the single statement that describes the United States' government:

This is a libertarian society based on individual rights.

.. and for the addendum, we would refer to the Bill of Rights, which sets out clear rules that we must never break because that would violate the core philosophy.

Still, if you grew up in another country and migrated here, and had to take a Civics test to get citizenship, you would find yourself repeating the words like a parrot, hoping for a peanut as a reward. Does that mean you grasp them? Philosophy is difficult to teach, so it must be instilled from birth. That's why it is so difficult for foreigners to feel at home here, at least until they become "one of us" -- if that ever occurs. And all of it stems from a philosophy that they never experienced until they stumbled off the boat in New York.

Most Americans are immigrants when it comes to technology. We just "don't get it". So we give up. We never "belong" to the new world of technology. We just visit there when we have to. This must change. This article is dedicated to the notion that everyone can learn about technology and with a little effort, gain more control of the technology that affects their daily lives. The reason I begin with the statement of Programming Philosophy -- describing "What is a Software Program" -- is to underline the importance of fundamentals in how we create things. If we respect design and plan what we do, we can achieve great things. If we ignore this process, we make a mess. Today's programs are mostly a mess. Let's see why.

The "Bad Old Days" of Linear Programming

I got curious about programming after buying my first IBM PC in the early 1980's.  The operating system was quite spare.  I used Microsoft Word and Excel early on.  I wanted to make those programs work better for my business.  So I learned a language called VBA ("Visual Basic for Applications") that ran inside the Microsoft Office suite.  This language was primitive, but for a beginner, it already felt like taking real control over the program's behavior.  For instance, I overrode the print dialog to add a few extra buttons that allowed me to do things "my way".

In 1985, I asked my brother -- a trained professional programmer, with a Master's Degree in Math and Computer Science -- to write a program that would run my small wholesale business.  He was busy, so only grudgingly agreed.  I waited and waited.  After a few months, he showed me the grand results: a window opened on the computer screen with a few labels and text.  I slumped.  "Is that it?" I gasped, not realizing just how much time it had taken him to accomplish this feat.  My brother could see that I didn't know what was actually involved in creating a program, especially for the time we were  living in.   So he invited me to his house in December, 1985, and sat me down in front of his computer.  This time, all I could see was a text editor with some weird-looking words and symbols filling its screen.  "What's that?" I asked.  My brother grinned.  "That's source code.  You're going to write your own program".

This was my first day as a programmer.  I committed myself to learning how to create applications that would make the computer more efficient for users.  I have been on this course ever since.

For the next ten years, I developed an ever-more-powerful program using a language called Turbo Pascal.  At that time this was a great way to create a quick application.  Eventually the program swelled to over 100,000 lines.  It ran my entire wholesale business.

Now that I look back on my first decade as a programmer, I can see that it was not going to prove robust or lasting.  I was engaging in a flat, simplistic sort of thinking.  But so was the rest of the programming world.

What Is Linear Programming?

Linear programming is taking a "straight line" from the start of a thing to its perceived end, made up of steps.  For example, an old MS-DOS batch file:

ECHO OFF
CLS
:MENU
ECHO ...............................................
ECHO PRESS 1, 2 OR 3 to select your task, or 4 to EXIT.
ECHO ...............................................
ECHO 1 - Open Notepad
ECHO 2 - Open Calculator
ECHO 3 - Open Notepad AND Calculator
ECHO 4 - EXIT
SET /P M=Type 1, 2, 3, or 4 then press ENTER:
IF %M%==1 GOTO NOTE
IF %M%==2 GOTO CALC
IF %M%==3 GOTO BOTH
IF %M%==4 GOTO :EOF
:NOTE
cd %windir%\system32\notepad.exe
start notepad.exe
GOTO MENU
:CALC
cd %windir%\system32\calc.exe
start calc.exe
GOTO MENU
:BOTH
cd %windir%\system32\notepad.exe
start notepad.exe
cd %windir%\system32\calc.exe
start calc.exe
GOTO MENU
This generates an on-screen menu. Pretty handy, and can be created in moments.  So why not just build all of our programs like this?  Let's have a closer look.

 

This is fairly clunky.  What happens if instead of typing the correct language:

SET /P M=Type 1, 2, 3, or 4 then press ENTER:

... you type:

SET P M=Type 1, 2, 3, or 4 then press ENTER:

The batch file crashes and so no menu appears.  All for the lack of a slash.  Imagine if this batch file created the main menu of your program.  The user would not be able to use the program at all. So linear coding is fragile.  It is easily broken.

Also, why weren't you warned about the error?  There's no good editor for this type of code that will warn about errors.  Even in more advanced linear languages, such as VB Script, Pascal, JavaScript, etc., perhaps 90% of all possible errors are simply ignored.  The editor can't help.  Linear programs do not provide adequate error checking.  They are extremely difficult to debug.  In very large programs, you may need to watch them run line-by-line just to find a single tiny error.  Development would take forever.  I know this because that's how we debugged early programs.  We don't want to return to this torturous practice.

Why are these two pairs of lines repeated?

cd %windir%\system32\notepad.exe
start notepad.exe

cd %windir%\system32\calc.exe
start calc.exe

... and if they are repeated, how do I make sure that each time they are written, they appear exactly in the same way?  Also, what happens if I get something out of order during that process?  This is one of the biggest risks in programming in general: redundancy.  Once we have copied and pasted something from one place to another, we cannot guarantee that each version will always be the same.  Maybe someone will come along and change one of these lines (but not both) as follows:

cd c:\windows16\system32\notepad.exe
start notepad.exe

... and for a while, it works.  But then a new version of Windows comes along and it breaks.  That's because under the new Windows, Notepad is located at:

c:\windows32\system32\notepad.exe

So the first of the two redundant code pieces works, because it uses a system variable to find the windows directory:

cd %windir%\system32\notepad.exe

... and the second one now fails because it doesn't.  But this is where programmers go wrong again.  Many beginners will actually repair both lines to match each other, so they both now say:

cd c:\windows32\system32\notepad.exe

... which actually works.  Until the next version of Windows, when the operating system may move Notepad to some new location.  The program was "fixed" through hard-coded linear thinking: the belief that if something passes a test, it is good coding.  But it failed at the design level because it was not robust over time.

So another weakness of the linear approach is that it is inherently redundant and almost impossible to keep up-to-date.
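The cure for this kind of redundancy is to compute the path in exactly one place. Here is a minimal sketch in Java (the helper name and the example paths are invented for illustration): if Windows moves its tools, you fix one line instead of hunting down every copy.

```java
public class LaunchPaths {
    // ONE place that knows how system tool paths are built. Every caller
    // shares this logic, so a future layout change is a single edit.
    static String systemToolPath(String windir, String exeName) {
        return windir + "\\system32\\" + exeName;
    }

    public static void main(String[] args) {
        String windir = "C:\\Windows"; // stand-in for the %windir% variable
        System.out.println(systemToolPath(windir, "notepad.exe"));
        System.out.println(systemToolPath(windir, "calc.exe"));
    }
}
```

Notice there is no way for the two launch paths to "drift apart", because neither is ever copied and pasted.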

Let's have a look at a VBA snippet that creates a new workbook:

Sub AddNew()
    Set NewBook = Workbooks.Add
    With NewBook
        .Title = "All Sales"
        .Subject = "Sales"
        .SaveAs Filename:="Allsales.xls"
    End With
End Sub

Set NewBook = Workbooks.Add

Does that mean NewBook is an actual workbook?  You'll say to yourself: of course it is!  How else could you create it?  Well this is why nobody writes VBA anymore.  The answer is not intuitive.  We could actually substitute this for the code above:

Sub AddNew()
    NewBook = 1
End Sub

Now NewBook is an integer - ? - !  How is that possible?  In VBA, an undeclared variable is a Variant: it can hold anything.  Once it is assigned, it takes on the type of whatever it was assigned.  But it can be reassigned at any time to anything else.  This makes VBA quite easy for beginners to code, but extremely hazardous to long-term safety and manageability.

For instance, what happens here?

Sub AddNew()
    NewBook = 1
    NewTree = "tree"
    Explosion = NewBook * NewTree
End Sub

NewBook is an integer.  How can that be multiplied by the string, "tree"?  It can't.  So the program halts where these two dissimilar elements are mixed together.  The editor doesn't tell you this in advance.  You have to find out at "runtime" -- when the program is running.  So these old linear editors are deeply flawed.

Programming professionals call this principle Type Safety.  Linear languages generally lack this key feature.  They are hopelessly vague.
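To see what type safety buys us, here is the same "explosion" transplanted into Java, a type-safe language (the method name is my own; the variable names mirror the VBA snippet above). The bad line cannot even be written:

```java
public class TypeSafetyDemo {
    // In VBA, NewBook could silently become anything. Here, the types are
    // declared, and the compiler checks them before the program ever runs.
    static String describeBook(int newBook, String newTree) {
        // int explosion = newBook * newTree;
        // ...does not compile: "bad operand types for binary operator '*'".
        // The mistake is caught at compile time, not discovered by a user
        // at runtime.
        return "book=" + newBook + ", tree=" + newTree;
    }

    public static void main(String[] args) {
        System.out.println(describeBook(1, "tree"));
    }
}
```

The whole class of "runtime type mismatch" failures simply ceases to exist.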

Procedural Programming Languages

A procedural language is one that is linear in its structure, just like a batch file.  But it provides a few new tricks to make coding easier.  The main one is the ability to create a procedure -- a named block of code that can be called from many places, eliminating redundancy.  You can actually fake this effect in a batch file.  For instance, runCopy.bat:

xcopy %1 %2

… which you can call directly from another batch file, as long as the two files are in the same directory or on the system path.  Here’s the second batch file, called CopyMyDocs.bat:

call runCopy “O:\MyDocs\*.*” “Q:\MyDocsBackUp\”

The CALL command runs the other batch file and then returns to the caller.  The two quoted strings are the two parameters, which are received inside runCopy.bat as %1 and %2.
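The same idea in a modern language is simply a function with declared parameters. Here is a hypothetical Java sketch of runCopy (the method, class, and file names are invented for illustration): the positional %1 and %2 slots become named, type-checked parameters.

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CopyDemo {
    // The batch file's "runCopy %1 %2" becomes a named procedure. The
    // compiler guarantees both parameters are present and are paths.
    static void runCopy(Path source, Path destination) {
        try {
            Files.copy(source, destination, StandardCopyOption.REPLACE_EXISTING);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Round-trip demonstration using a temporary directory.
    static String demo() {
        try {
            Path dir = Files.createTempDirectory("copydemo");
            Path src = dir.resolve("MyDoc.txt");
            Files.writeString(src, "hello");
            runCopy(src, dir.resolve("MyDocBackUp.txt"));
            return Files.readString(dir.resolve("MyDocBackUp.txt"));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

Unlike the batch version, forgetting a parameter or passing them in the wrong order is a compile-time error, not a silent failure.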

The Procedural Summit: C

With the development of C in the early 1970's, procedural programming reached a high-water mark.  Powerful, complex programs could be created that ran at lightning speed.  But the enormous power of C came at an extraordinary price: C was not easy to code, and programs tended to be buggy and difficult to control.

Summary

The "bad old days" of linear, procedural programming provided:

Pro's:
  • Relatively quick to develop.
  • In many cases, easy to write with a beginner's skills.
  • Could be developed and used on-the-fly and often as a part of other applications (such as VBA inside Excel, etc.).
  • In the case of C, could be extremely powerful and amazingly fast -- though required expert skills.
Con's:
  • Linear methodology did not reflect how things occurred in real life scenarios.
  • Virtually no type safety.  Anything could be anything.  No predictability.  Poor contracts between parts of the program. The code itself was parsed at compile time, and much of it was late bound, meaning that the actual type was not known until the program actually ran.  Many types of errors arose out of this loose construct.
  • Redundant and verbose, so impossible to version or debug.
  • Mostly weak editors would let you hang yourself while coding and find out later when the user was trying to run your program.
  • Poor error-handling; most problems just crashed the program.
  • No event-driven model, so no interaction between elements based on competing interests, as in real life.

The Arrival of Object Oriented "Behavioral" Languages

The high-water mark in the development of object-oriented languages came when slow-to-the-party Microsoft, after watching C++ and Java for 16 years, released its opus: a language called C#, which over the next decade would bring extraordinary benefits to the programmer.  C# filled the need for a powerful, easy-to-learn-and-use language for the creation of applications.  It restored faith that Microsoft was the world leader in programming languages, if not software development (sigh).  (Note: an “application” is a file with an .exe suffix that you run from your computer.  It does not reside inside a web browser.)

C#’s only major weakness was speed.  It still waddled along behind the hyper-fast C++ in real-time benchmarks.  This is because C#, like Java, opted to produce byte code that the runtime executes as a separate layer.  The theory was that the compiled program would be independent of any given operating system.  But in the 11 years since, no company has managed to make .NET run completely and successfully on a foreign OS.  So the decision for byte code was grandiose and unnecessary.  In C++, the entire output is already “pre-digested” and ready for the operating system.  That’s why it’s so much faster.  Microsoft never corrected the issue; even today, C# is hobbled by this approach.

Why Is Object Oriented Coding Better Than Old-Style Linear Scripting?

In "object-oriented" or "behavioral" programming, instead of trying to tell the program what to do, we create "objects" that behave much like human beings. They have their own information they store (as our brains do), things they do (behaviors), personal relationships (class derivation), and a means of communicating with each other (events).
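Those four ideas map directly onto language features. Here is a minimal, hypothetical Java sketch (all names invented) showing stored information as fields, behaviors as methods, relationships as class derivation, and events as callbacks another party can subscribe to:

```java
import java.util.function.Consumer;

class Person {
    private String name;                         // information they store
    private Consumer<String> onSpeak = s -> { }; // event: others can listen
    Person(String name) { this.name = name; }
    void setOnSpeak(Consumer<String> listener) { onSpeak = listener; }
    void speak() {                               // behavior: something they do
        onSpeak.accept(name + " says hello");
    }
}

class Programmer extends Person {                // relationship: class derivation
    Programmer(String name) { super(name); }
}

public class BehaviorDemo {
    public static void main(String[] args) {
        Programmer p = new Programmer("Ada");
        p.setOnSpeak(System.out::println);       // another party subscribes
        p.speak();
    }
}
```

Note that Person never knows who is listening; the subscriber decides what to do with the event, just as people do in real life.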

Of course, this doesn't guarantee that a programmer will always implement this concept properly. That is the art and science of professional coding. For instance, here are two ways to do the same thing: make animal noises based on some inputs.  I’ll use C# as the language.  Strangely, you can write both good and bad code inside C#, as with any editor/compiler.  This is why we need to train programmers to think in a new way about what they do. Object coding is like chess; linear coding is like checkers.  So first, the quick (wrong!) way, which is how 99% of the world’s so-called programmers would do it:

using System;

namespace AnimalSoundApp
{
   public class Program
   {
      private static void MakeAnimalSound(string[] animals)
      {
         // Guard against a null array (for safety).
         foreach (var animal in animals ?? new string[0])
         {
            // Check each animal by its name and issue the proper sound.
            switch (animal)
            {
               case "cat": Console.WriteLine("Meeeooowww...");
               break;
               case "dog": Console.WriteLine("Bark! Bark!");
               break;
               case "bird": Console.WriteLine("Chirp! Chirp!");
               break;
               default: Console.WriteLine("{Shhhh…}");
               break;
            }
         }
      }

      public static void Main(string[] args)
      {
         MakeAnimalSound(new [] {"cat", "dog", "bird"});
      }
   }
}

The console output looks like this:

Meeeooowww...
Bark! Bark!
Chirp! Chirp!

Upon questioning, these so-called "programmers" would say:

  • They did what they were asked to do: make animal sounds based on an input.
  • They tested it.  Works every time!
  • They wrote it in 10 minutes.  What a savings!  At this rate, they can write a million lines of code in their first year.  What a value!

But under close analysis, all sorts of issues pop up. For instance, what happens if we misspell something?  Maybe just once, like here:

     public static void Main(string[] args)
     {
        MakeAnimalSound(new [] {"ct", "dog", "bird"});
     }

The output would look like this:

{Shhhh…}
Bark! Bark!
Chirp! Chirp!

That’s incorrect. The first line should have read, “Meeeooowww...”.  But the program couldn’t find a “cat” because the input was misspelled as “ct”.

No programmer ever thinks they can make a mistake like this.  But in every program I have ever reviewed, I have found lots of them.

What if we have to create a lot of animals?  Maybe thousands?  And what if they have hundreds of different behaviors, like how they move, and all of that has to be reflected in the program?  Is that an exaggerated expectation?  Modern programs must manage large enterprises.  You can see that this would create a huge mess.

The code is also quite redundant.  We define that as any same or similar code in the program that could easily be rewritten to prevent the redundancy.

The code is linear: crude and unsophisticated in its approach to the problem.

The Object-Oriented Solution

As an Object-Oriented Programmer (“OOP”), the first thing I ask about a new assignment is, “how can I best explore, comprehend and act upon all of the potential behaviors in this application?”

  • Don’t cats and dogs have an in-bred relationship?  Especially when they don’t know each other?  Dogs tend to fight with cats.  The same applies to cats and birds.  Cats pursue – and eat -- birds.
  • If any of these critters got into a fight, wouldn’t they make different sounds than if they were unconcerned?
  • The program should remain uninvolved in the animals’ interactions.  All it should provide is an open yard where the animals interact according to their innate behavior.
  • Finally, the program should be fun and easy to use.  So the user must be involved in what happens.

Here is what I came up with.  I’ll run you through the screen shots.  Let’s see if you start getting a sense of what we mean by behavioral programming:

The program has only one screen.  It’s a “yard” with three animals: a dog, a cat and a bird.  Notice the circles around the animals. Those represent the distance they can see (outer blue) and the distance at which they feel confronted, or confront others – their “fighting range” (red).  These circles are different sizes because each animal has unique capabilities.

OOP_1

Here’s the "user" part of the program.  With the mouse, left-click on the cat and drag it closer to the bird.  The bird reacts first because it has a wider “sight” range.  It sees a superior enemy.

Then the cat reacts, as the bird enters its sight range. The bird is an inferior enemy / prey.  So the cat starts getting aggressive.

As the cat gets into the bird’s “red” fighting range, the bird reacts with a new sound.

Finally, the cat gets the bird within its attack range, and makes its own new sound.

Now let’s bring the dog into the picture.  As the dog approaches and gets the cat into its sight range, the dog sees an inferior enemy, so becomes more aggressive.

The cat quickly perceives this change, as the dog falls into its sight range.  The cat now sees a superior enemy.  This “trumps” the cat’s preoccupation with the bird.  So the cat makes fear-based sounds.

The dog gets the cat into its fighting range and is ready to fight, so makes its angriest sound.

The cat sees the dog in its close range and forgets about the bird entirely, making its own fighting sound.
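To make the idea concrete, here is a minimal, hypothetical sketch of how such classes might be structured. This is not the actual AnimalSoundApp source; the class names, ranges, and sounds are invented for illustration. The point is that each animal owns its own behavior, and the "yard" never tells anyone what to do:

```java
// Each animal stores its own ranges and decides its own reactions.
abstract class Animal {
    final double sightRange, fightRange;
    double x, y;                               // position in the "yard"
    Animal(double sightRange, double fightRange) {
        this.sightRange = sightRange;
        this.fightRange = fightRange;
    }
    double distanceTo(Animal other) {
        return Math.hypot(x - other.x, y - other.y);
    }
    abstract String calmSound();
    abstract String alertSound(Animal other);
    abstract String fightSound(Animal other);
    // The reaction is the animal's own behavior, driven only by distance.
    String react(Animal other) {
        double d = distanceTo(other);
        if (d > sightRange) return calmSound();
        return d <= fightRange ? fightSound(other) : alertSound(other);
    }
}

class Bird extends Animal {
    Bird() { super(40, 5); }                   // wide sight, tiny fight range
    String calmSound() { return "Chirp! Chirp!"; }
    String alertSound(Animal other) { return "Chirp?! Chirp?!"; }
    String fightSound(Animal other) { return "SCREECH!"; }
}

class Cat extends Animal {
    Cat() { super(30, 10); }
    String calmSound() { return "Meeeooowww..."; }
    String alertSound(Animal other) {          // prey vs. superior enemy
        return (other instanceof Bird) ? "Mrrr..." : "Hsss!";
    }
    String fightSound(Animal other) {
        return (other instanceof Bird) ? "*pounce*" : "HSSSS!";
    }
}

public class YardDemo {
    public static void main(String[] args) {
        Cat cat = new Cat();
        Bird bird = new Bird();
        bird.x = 100;                          // far away: both are calm
        System.out.println(cat.react(bird) + " / " + bird.react(cat));
        bird.x = 20;                           // inside both sight ranges
        System.out.println(cat.react(bird) + " / " + bird.react(cat));
        bird.x = 4;                            // inside both fighting ranges
        System.out.println(cat.react(bird) + " / " + bird.react(cat));
    }
}
```

Compare this with the switch statement earlier: adding a thousand animals here means adding a thousand small classes, not a thousand new cases tangled into one procedure.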

Conclusion: Linear vs. Behavioral Programming

Related Terminology
  • Linear: Scripting; Distributed Programming (writing code everywhere rather than in one place and then sharing it); Late-Bound (no type safety).
  • Object-Oriented: Behavioral Programming; Interface Programming; Centralized (Common) Code; Compile-Time Type Safety; Multi-Dimensional Approach (think of “3-D chess”).

Methodology
  • Linear: “Telling the program what to do” – rigidly ordered, consecutive statements made up of hard-coded primitives.
  • Object-Oriented: Classes define behaviors.  Interfaces declare class interactions.  Events notify classes about other classes.  Classes work independently from each other, mirroring real-life behaviors.

Mind-Set, Massively Over-Simplified
  • Linear: How can I give more instructions so the program will do more things?
  • Object-Oriented: What is it?  The program is an illusion created by independent interests, all of which know what they are.

Attitude
  • Linear: More is better.  Work is rewarded by how many lines of code a programmer can produce that can pass a unit test.
  • Object-Oriented: “The program shall be small.”  Work is rewarded when it is centralized, concise, complete and fully compliant with philosophy.

Mantra {Eastern Zen-Like Mission Statement}
  • Linear: If I copy the last guy’s code, I’ll fit in and won’t get into trouble.
  • Object-Oriented: The program shall always work perfectly.  If it fails to work perfectly, then we will find the problem in a centralized location and fix it.  After that, the program shall always work perfectly.

Scalability
  • Linear: Without concepts and organization, the code is a tightly-bound series of instructions that are hard to alter except by throwing everything away and starting over.
  • Object-Oriented: Although intensely analyzed and organized, the class/interface paradigm can grow in virtually any direction over time, without major effort.

Confusing Nautical Metaphor
  • Linear: Over-steered.  Each instruction is mandatory, but at the same time excessive, robbing the program of its “oxygen”.
  • Object-Oriented: Slightly under-steered.  Classes possess their own behaviors.  Theoretically, we don’t actually know all of the things a program might do.

Strengths
  • Linear: Quick to develop.  Instant gratification.  Low skill set required for basic coding.
  • Object-Oriented: Extremely well-organized and manageable.  Almost zero redundancy.  Easily changeable, debuggable and scalable.  Very predictable results.

Weaknesses
  • Linear: Easily broken; light-weight and unable to handle complex scenarios; code must grow at the same rate as the program’s requirements; hard to fix; almost impossible to scale elegantly; difficult to learn someone else’s work, as it is convoluted, verbose and messy; horridly redundant, with each area of duplicated code inadvertently having its own logic; unpredictable results.
  • Object-Oriented: Design and architecture take 3-6 months for most medium-sized programs, with very little to show for it.  Difficult for amateur programmers to learn, since it is “three-dimensional” vs. linear thinking.  Requires a conceptual brain, which, to technical programmers, is like asking them to attend art class.

Can It Pass Intense Random Unit Testing?
  • Linear: Not a chance.  The coders created slam-dunk unit test scenarios to “prove” their code.
  • Object-Oriented: Every part of the code is as robust as the next.

Cost
  • Linear: Cheap initially, then heavier as reality sets in, and unbelievably expensive over time, usually resulting in tossing the entire project.
  • Object-Oriented: Expensive initially, requiring very heavy analysis and planning.  Catches up with scripting at the half-way point, but with robust initial results.  Very cost-effective in the long term.

How to Determine Whether This Approach Is Being Used
  • Linear: Loose primitives in the code.  Poorly declared classes that contain scripts.  No behaviors created or enforced.  Very large code base chock-full of redundant elements.  “Speed coding” is a 99% indicator.
  • Object-Oriented: No redundancy.  Narrowly defined classes that only contain the code needed to support what they “are”.  By file count, the program will contain 15-25% interfaces, 15-25% abstract classes, and the rest regular classes.  By line count, the abstract classes should contain 40-50% of the program code.

Who’s Using This Approach
  • Linear: 99% of the world’s programmers, including those who think they are familiar with objects.
  • Object-Oriented: Programmers who are obsessed with fundamentals and have the discipline to enforce them.

I encourage you to download the ZIP file and run the "OOP" versions of the program (inside the zip file, \AnimalSoundApp\Program\AnimalSound_OOP.exe).  Important: The attachment is shown as AnimalSoundApp.txt because it is easier to download that way. Just copy it anywhere and rename it to AnimalSoundApp.zip so your ZIP program will be able to open it easily.

AnimalSoundApp

Notice how vastly different this source is versus the horrid batch-file script. It’s also open source.  Enjoy.

By Stephen Marcus

Get In Touch


Marcus Technical Services, Inc.

P.O. Box 972

Lake Forest, California 92609

marcus@marcusts.com

(760) 840-7714
