2011-12-31

Seven swans-a-swimming

On the seventh day of Christmas, we should take a rest
from struggling and striving, remember we're blessed,
for blessings enrich us when handled with care
and with those who don't have them always be shared.


I shiver at the thought, not because of the weather. The winters in these parts are getting milder, but the cold, hard cash that drives too much of our thinking these days is more than enough reason to seek more warmth. But where can we find it?

It's hard to talk about money without talking about success. For too many people, the measure of success is a person's net worth, and when it's about money, it soon ends up being about "me". Like spoiled children, we hear the cries of "Mine! That's mine! You can't have it. No, mine!" Spoiled children are sad enough. Spoiled adults are even worse.

We also like to think, and we're told often enough, that what we have are the rewards of our own efforts. I didn't realize that inheritance was such hard work, but I don't know everything; still, most of the super-rich these days didn't earn their fortunes themselves. There are folks who did "earn" theirs, say Bill Gates or Warren Buffett, but does anyone really think they worked that much harder than everyone else, especially those still aspiring to get into the rich club? I doubt it, and so there is an apparent discrepancy between working hard and getting ahead. What about luck? What about being in the right place at the right time? These aren't things you can plan for, and they have a strong influence on the difference between success and failure. But, worst of all, we are led to believe that the successful did it all on their own. And this is one of the biggest untruths in the history of propaganda.

You don't have to believe me. Try Malcolm Gladwell's Outliers for a behind-the-scenes look at what is really happening: being in the right place at the right time is more important than anything, and practice is more important than talent. These individuals then tend to get more than their share of support from everyone else, making success more of a collective than a solitary effort. This is borne out in biology and evolution. David Sloan Wilson argues quite convincingly in his 3 August 2011 New Scientist article on selfless evolution that cooperation, not competition, is the strategic, that is, long-term, evolutionary mechanism. Humans survive best and longest if they cooperate. Competition is destructive, and that is certainly the effect we are seeing from the latest economic catastrophe.

Put more succinctly, "we" is much more important than "me", at least when thought about in the long run. Nature knows this, but we seem to have forgotten. Our forebears, though, were also forced by nature to come together, to huddle together in search of warmth and hope for the year to come. It was a part of their experience. We haven't outgrown the experience, nor have we found a worthy substitute for it, but the Twelve Days of Christmas give us the chance to regain that experience once more.

2011-12-30

Six geese-a-laying

On the sixth day of Christmas, we should stop and reflect
on those who get lost or ignored through neglect
by well-meaning people, like you and like me
but they're there nonetheless, if we choose but to see.


If you've taken the time to give yesterday's question any serious thought, you will most likely have noticed something. It's hard to find a "why" in what you have or want to have ... more often than not we start thinking about what we want to be. This is an existentially fundamental distinction and it makes all the difference in the world. Erich Fromm, the émigré German psychoanalyst, wrote a book in 1976 which should be on everyone's reading list: To Have or To Be. It's not an academic volume of technobabble and overweening erudition; rather, it is a simply written, human book that asks you to ask yourself the meaning question, too.

The question whether to have or to be has taken on renewed significance this year in particular as we have seen one country after another fall into extreme financial difficulties, as we witness the epidemic level of foreclosures being exercised by banks, as we struggle to escape the largest economic crisis in our lifetimes, one which may be even worse – at least in certain regards – than the one that led to the Great Depression. Let's face it: we're not in very good shape and promising solutions are as good as nowhere to be found.

The result? An air of barbarism, I would say, and I don't think I'm exaggerating. If you want to know what someone thinks, just listen to their choice of words. It will tell you everything you need to know. Has it occurred to anyone besides me that the tenor of public dialogue has become harsher, crueler, more cynical, brash, aggressive and violent? Oh sure, we all know the effects of a good pepper-spraying or clubbing, but what about a well-executed tongue-lashing? You can wash the pepper spray out of your eyes and the cut on your head will heal, but the psychological wounds that are inflicted with words are some of the hardest to heal, and some never, ever heal properly.

Let's be honest, our entire public dialogue, be it in the public square or the presidential debates, is about one subject and one subject only: money. We only ever talk about money. The schools are collapsing because they have no money, and most people don't have money to send their children to other schools. Public services don't function without money and so the fire department watches a house burn down in Kentucky because the owner hadn't paid the subscription fee. Americans don't want to pay taxes because they don't know how the money is spent ... well, maybe building bridges to nowhere but certainly not fixing the ones that are in need of repair. If you step back and listen, I would be willing to bet that within the first 30 seconds of any conversation, debate or argument about any public issue, the money argument will arise: how are we going to pay for it ... or even better, who is going to pay for it.

Unfortunately, these days it is all about the money, and only about the money.

2011-12-29

Five gold rings

On the fifth day of Christmas, the time just got away.
Instead of deep reflection it was dash and rush and pay.
But, with a bit of effort and help from all around
we made it back quite safely where peace and love abound.


Let's pick up on the theme we started on the Third Day of Christmas: meaning. The question we asked then is an important one, much more important than we like to think: what gives meaning to our lives? This is not a trivial question, regardless of how simply it is phrased. The German philosopher Friedrich Nietzsche (1844-1900) once wrote that knowing why one lives allows one to endure almost any how. This rather simply stated insight led the Austrian psychiatrist and psychoanalyst Viktor Frankl (1905-1997), a survivor of the Nazi death camps, to use it as the basis for his highly successful logotherapy.

Meaning matters. What Nietzsche intuited, Frankl experienced, and under the most extreme of circumstances. Meaning made the difference between life and death in untold instances, and while things may not be as existentially stark now as they were in Frankl's life, the rules of the life game haven't changed that much. So, the question that springs to my mind is how many of us have actually sat down and thought seriously about what gives meaning to our lives. If you never have, it's long overdue. If you have, is it the same as it once was? And in keeping with the Spirit of the Season, it is precisely this time of year ... the time between the years, as it is ... that we should find a few quiet moments to reflect on just that. What gives meaning to our lives?

This isn't about New Year's resolutions that we never keep. Why should we? We make them because we think they'll help us become a better or healthier or happier person, but do we really know what that means? Of course we don't keep them; they are little more than add-on rules that simply get in the way, another set of obligations imposed upon us, much like the perceived unreasonable demands we get from our bosses or spouses or children or ... . Yes, I believe that most of us are so wrapped up in the details of our lives that we are not as clear on what is meaningful in them as we should be.

Meaning's a big deal. It's the answer to the "why" question. And "why" is always a big deal. So, do we work to live or do we live to work? Do we live for our children or merely through them? Do we live for the things that we have and own? Do we live for what we learn? It should be clear that I'm not asking why we get up and go to work every day (or don't, if we don't have a job); I'm asking whether you know why you are even alive to begin with. It's a much more serious question.

The answer to that question is the gold – not brass – ring on the carousel of life.

2011-12-28

Four colly birds

On the fourth day of Christmas, we should take the time
to think of those others whose lives aren't as fine
as ours, in their fullness, their richness and glee,
for much suffering results from unintentional greed.


Before pursuing the thread of meaning which was introduced yesterday, I'd like to stop for a moment and reflect upon the relationship between light and spirituality that was mentioned as well. For as long as I can remember, and I'm pretty sure for as long as anyone can remember, there has been an intimate connection between Light and Spirit, between Light and whatever it is we conceive a God-like entity to be. Prior to monotheism, the sun literally played the light role. Within the monotheistic traditions, God was associated with light. What is more, even in the non-theistic religions, such as Buddhism, candles and light play no insignificant role. Why might that be?

Regardless of any other associations we may be carrying around in our heads, anyone who has ever taken a closer look at light itself cannot help but be amazed. OK, a bit of a stretch. We can't see light. If you look into a container full of only light, all you will see is darkness. What we see, literally, are those objects from which light is being reflected. We don't see light; we can only perceive its reflection. And for those of you who now have thoughts of mirrors forming in your heads ... yes, we have to ask ourselves perhaps which side of the looking glass we're on.

Seriously, however, light may be the most fascinating thing in our universe. Nothing can travel faster (and yes, you neutrino buffs, the jury is still out on that one), but even more amazing, it is the one thing that can apparently be two things at once. When speaking of light, we most often hear words like "ray" or "wave", for light does in fact behave like a wave, under certain circumstances. Strangely enough, though, light can also be found in particle form, as photons, as they are called. In other words, sometimes light is a wave, sometimes a particle. When is it one, and when is it the other? The answer, as is so often the case in life, is "it depends".

The situation plays a big part in it, of course, but at the "moment of truth", it appears that the light "knows" what it needs to be and expresses itself appropriately. As strange as that may sound, there is a strong case to be made for light "making the decision" itself. Expressed in simplest terms, light apparently shares certain properties of consciousness. I certainly don't expect you to take my word for it. A well-argued and well-documented presentation can be found in Arthur Young's Reflexive Universe. (For those of you who don't know of him, Young is the person who figured out how to make helicopters fly.) Could it be that light itself is the link between the physical and metaphysical? It's hard to say, but it certainly gives us something to think about at this reflective time of year.

2011-12-27

Three French hens

The third day of Christmas takes on a new air.
We're tempted to work but we must be aware
that this is the season to look deep inside
for hope and for peace in which we may abide.


The modern individual "knows" that myth is a dirty word. We use it to describe anything, particularly statements and stories, that we believe to be unabashedly made up, pure fiction. One of the goals of 19th-century science was to demythologize everything, and given that they have got us using "myth" as a derogatory term, we have to admit they did a pretty good job.

Nevertheless, we have seen over the past week that there is more to myth than just silly made-up stories. The whole reason for having Christmas now, the reason that there has been some kind of celebration at this time of year for almost as long as there have been humans in northern climes, is that something occurs that can only happen now. The sun "dies" and is "reborn" and through its "sacrifice" the rest of us can live. Overstated? Well, yes and no.

On the one hand, we have a very nice metaphorical description of what is happening cosmologically. (And for those of you interested in just how widespread this kind of mythological description of reality may be, I would suggest de Santillana and von Dechend's Hamlet's Mill.) Scientists and non-scientists, materialists and non-materialists alike agree that the days get shorter until 21 December, and afterward they start getting longer again (until they reach their longest point around 21 June, and then they start getting shorter again). This is the yearly cycle, and this cycle is such an intimate part of our lives that we give it absolutely no thought at all. What is more, because we never think about it, we don't really care about it. And, as we all know, what we don't care about doesn't matter, and what doesn't matter is meaningless. In other words, what was once a central (or at least more prominent) part of our lives has become meaningless. Have our lives in general become more meaningless as well?

This brings us to a very important question: what gives meaning to our lives? I know it is a question we don't think about very often, but such reflection was also part of what made this time of year special. The sun is at its weakest, and we were, traditionally at least, weaker as well; it is, quite literally, the least physical time of year. The long, cold, dark nights of winter were an excellent opportunity to reflect. The festival of light was an opportunity to strengthen ourselves inwardly for the challenges which were soon coming again "out there" in the world. In other words, for a variety of reasons, physical, cosmological, and mythical, this was a spiritual (or at least, metaphysical) time of year. To paraphrase the late, great John Lennon, all we are saying is give myth a chance.

2011-12-26

Two turtle doves

The second day of Christmas falls
here without stores and without malls.
Just family, friends, those we hold dear
especially close this time of year.


Today is a holiday in Europe, a real holiday, with closed stores and festive programs. It's called Boxing Day in the UK, but if you're interested in why, you can either ask a Brit or just google it. I would prefer to pick up where I left off yesterday, and the day numbered two seems to be a good place to start. After all, we were talking about two theories, weren't we?

Science is a fascinating way of looking at the world. It uncovers bizarre creatures at the bottom of the oceans, it identifies harmful viruses and bacteria, and it has provided us with great boons, like penicillin, and great banes, like nuclear energy (just think of the waste). But we should keep in mind that it is only one way of looking at the world. It does well with facts, but there is more to the world than facts.

What about art? That is also a valid way of seeing the world around us. The realism of a Rembrandt, the irrealism of a Picasso or Dali, the emotion of a sculpture by Michelangelo are all valid expressions of feelings, of something deeper within us that can be moved (either positively or negatively). Art can be enraging or inspiring or any range of feelings in-between. Aren't feelings a legitimate part of our make-up as human beings? I think they are.

And what about things that we simply have difficulty dealing with, such as psychic phenomena like telepathy or remote viewing, or even the existence of God? Science is not in a position to make a statement because such things lie beyond the realm of the five senses to which it has restricted itself. Art cannot address them either, because these phenomena have something almost physical and something non-physical about them, something that goes beyond feelings themselves.

In other words, the simple dichotomy we have made for ourselves – it's real or it's not – is not really adequate to the task. We have to allow ourselves to explore other realms with other methods if we are to truly understand who we are, why we are here, and what we should be doing with ourselves.

Our forebears, among other things, attempted to encapsulate this "more" of which I am speaking in their own expression of knowledge, in myths. We should be grateful to them for what they have bequeathed to us, even if we don't understand it anymore. We need to give myth (and metaphysics) another, a fair, chance. Who knows, maybe all of us will be the better for it. It presupposes, however, that we talk with one another, not just to – or worse, at – one another.

2011-12-25

A partridge in a pear tree

Best wishes for all at this time of year,
days of peace and full of cheer.
Some simple hope, a kindly smile
can make dark days the more worthwhile.


For those of you who are celebrating: Merry Christmas. Today is not only Christmas Day, it is the first day of Christmas as well.

Why today? If you recall from the time-before-last on the 21st, the sun reaches its lowest point (nadir) in the Ecliptic on its yearly journey (from a geocentric perspective, of course). It will be recalled that it stays there for three days. In other words, on the third day (that is, the 24th) it starts its ascent; it is, if you'll excuse the metaphor, "born again" for the coming year. Births are beginnings, so it seems fitting that the festival of renewed light and life begin now.

Regardless of our reasoning, though, we would all agree that without the sun, there is no life on earth. Granted, there's a finely tuned relationship involved, but it is easy to understand why the sun has been celebrated as the Giver of Life through the ages. Yet there is another paradox involved. At this time of year, the Earth and Sun are nearly as close to each other as they ever get (perihelion falls in early January), but still, in the Northern Hemisphere, it is as cold as it gets. This is because the earth's axis is tilted about 23 deg. from upright, so at this time the sun's rays strike the Northern Hemisphere only obliquely. In other words, the effect of the sun here is qualitatively different – at its weakest – compared with Midsummer, when its effects are the strongest.
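
For the numerically inclined, here is a rough back-of-the-envelope sketch in Python of just how much weaker that oblique winter sun is. The 48.8 degree latitude is my own assumption (roughly Stuttgart), the 23.44 degree tilt is the standard figure, and the calculation ignores the atmosphere, day length and everything else that also matters; it only compares the strength of the noonday sun on flat ground at the two solstices.

    import math

    TILT = 23.44      # axial tilt of the earth, in degrees
    LATITUDE = 48.8   # assumed mid-northern latitude (roughly Stuttgart)

    def noon_intensity(latitude_deg, solar_declination_deg):
        """Relative strength of the noonday sun on flat ground (1.0 = sun directly overhead)."""
        elevation = 90.0 - latitude_deg + solar_declination_deg   # noon solar elevation angle
        return max(0.0, math.sin(math.radians(elevation)))

    winter = noon_intensity(LATITUDE, -TILT)   # winter solstice: sun over the Tropic of Capricorn
    summer = noon_intensity(LATITUDE, +TILT)   # summer solstice: sun over the Tropic of Cancer
    print(f"Noon sun, winter solstice: {winter:.2f}")
    print(f"Noon sun, summer solstice: {summer:.2f}")
    print(f"The midsummer noon sun is roughly {summer / winter:.1f} times stronger.")

Run it and the midsummer figure comes out roughly three times the midwinter one – same sun, very different effect.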

It is here that I start to feel a bit sorry for the materialists amongst us. Differences are differences, be they material or otherwise, but the materialists only allow for one kind. Given that some 75% of the mass of the universe is unaccounted for, even our strongly scientific friends indicate that there is apparently more to cosmological phenomena than meets the eye. Until the rise of empiricism and its blitzkrieg on our thinking, really around the time of the Enlightenment, we had another option. We had a physical realm, but we also had a metaphysical one as well. As we know, the latter was given a pretty bad rap and relegated to untouchable status, but it hasn't gone away. We should note that the existence of the metaphysical was not disproven; it was simply argued away. That is, if you accept the materialist argumentation, there is no metaphysical, but if you don't agree, then, to them, you are simply displaying intellectual weakness.

Let's face it though, it's not intellectual weakness at all. One theory says there is nothing more than the physical. Another theory says there is more, at least the physical and metaphysical. I would expect there would be a discussion and debate amongst the theorists, but what I mostly see is ignorance: one side simply ignoring the other. I believe we can do better than that.

2011-12-23

Hope in the darkness

The experience of dark and cold can call forth feelings of despair. Food was in short supply; perhaps one family or group didn't have enough to make it through these times. But those who had more could and did share, for it was understood that life could be better if more, not fewer, survived. Fire brought warmth, but so did huddling together around the fire, and so did the sharing. At worst, you knew you were not alone. At best, you had hope for the future.

These, too, are feelings we share even today. They express themselves perhaps in different ways, but the feelings are the same. Our forebears weren't children; they weren't child-like in their innocence and blissful in their ignorance. It is hard not to believe that there were Einsteins among the cavemen. Someone figured out how to handle and manage fire. Someone thought the wheel could be a good idea. Yet these days I often sense just a trace of arrogance, of (unwarranted) superiority, because the least among us know more than those folks could ever have dreamed of. Aren't we just grand? But are we? We may know how to download apps to our smartphones, but do we know how to share with and care for each other? We may have indoor plumbing and the latest microwave, but can we raise another person's spirits enough that they survive until spring? I'm not so sure.

While we moderns may prefer snazzy formulas and differential equations to encode our knowledge, for millennia this was done in stories, poems, and songs. The myths of the Ancients, whether we like it or not, encapsulated an extent and depth of knowledge – of nature, the universe, the movements of the planets, the cycle of the seasons, when to plant, when to harvest, when to celebrate, and how to hope. This was knowledge meticulously collected, constructed and preserved because of its survival value. We like to think today that we don't need any of this anymore, but a quick look around by a half-observant eye tells you that much of what we see is tenuous, fragile, hollow, insubstantial, and downright disgusting. Don't get me wrong: just as there were caveman Einsteins, there were caveman jerks and thieves and exploiters. What they bequeathed to us, however, was not the worst of themselves, but their best, that is, their myths. Why? Because they are such cute stories? I doubt it. Rather, perhaps, so that we might learn sooner what they simply learned too late: cooperation with each other and with the world around us is a sustainable strategy; competition only ever gets you short-term gains. It is simply too selfish for its own good.

Over the next couple of weeks, then, I'd like to revisit one of these mythical constructs to see if there is not something therein that we can cull out for today … and, it has more to do with each of us than it does with anything else.

2011-12-21

The longest night

This is a special time of year … not because some are waging war on Christmas, not because some are once again trying to debunk the holiday with pagan myths, and certainly not because it's the biggest shopping season of the year. This time of year is special in a very fundamental way, and it might be worthwhile to reflect on why.

It's not a coincidence that all of those pagan holidays, as well as Christmas and Hanukkah, occur right now, for in one way or another they share something very special in common, namely Light. For those of us living in the Northern Hemisphere, the Winter Solstice marks the shortest day and the longest night of the year. It is the time when the Sun – as seen from the Earth – halts over the Tropic of Capricorn (23 deg. 26 min. S latitude) for three days at the end of its journey to the South. After the solstice, it will appear to travel north again; the days will get longer and the nights shorter until it reaches its northernmost point on 21 June (the longest day and shortest night of the year). For those of you who are keeping track and like to be exact, the solstice will occur today at 16:19 GMT (or 11:19 am for you in NYC, and 8:19 am for our California friends).
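
If you want to check the time-zone arithmetic yourself (taking the 16:19 GMT figure above at face value), a few lines of Python with the standard zoneinfo module (Python 3.9 or later) will do it:

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    # The solstice moment as quoted above, in GMT/UTC.
    solstice_utc = datetime(2011, 12, 21, 16, 19, tzinfo=timezone.utc)

    for label, tz_name in [("New York", "America/New_York"), ("California", "America/Los_Angeles")]:
        local = solstice_utc.astimezone(ZoneInfo(tz_name))
        print(f"{label}: {local:%H:%M %Z}")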

Yes, it's dark in northern climes at this time of year. Where I live near Stuttgart, Germany, the sun rose today at 8:13 am (local time) and set at 4:27 in the afternoon. That doesn't leave much chance for light at all, and we have it good. My friend in Bergen, Norway won't experience the sunrise until 9:44 am (local time), and it will have gone down by 3:28 pm. Blink, and you might miss it.
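
And just for fun, the same point in code – a quick sketch that turns those sunrise and sunset times (the ones quoted above) into hours of daylight:

    from datetime import datetime

    def daylight(sunrise, sunset):
        """Hours and minutes of daylight between two 'HH:MM' times on the same day."""
        fmt = "%H:%M"
        delta = datetime.strptime(sunset, fmt) - datetime.strptime(sunrise, fmt)
        hours, remainder = divmod(int(delta.total_seconds()), 3600)
        return hours, remainder // 60

    for place, sunrise, sunset in [("Stuttgart", "08:13", "16:27"), ("Bergen", "09:44", "15:28")]:
        h, m = daylight(sunrise, sunset)
        print(f"{place}: {h} h {m} min of daylight")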

These, of course, are merely the facts, the pure, astronomical, verifiable facts. It is simply darker for a lot longer than it is light at this time of year. What we all experience, though, is the stillness, perhaps the sadness, the wish to maybe withdraw and to reflect throughout those long, dark hours. We feel it today, if we allow ourselves to. And our parents, grandparents, and their parents and grandparents, and theirs and ever on, further back, felt it as well. The phenomenon is the same, to be sure. How we choose to deal with it is quite different than it once was.

This is not to say that how we deal with it is better. Nor am I implying that how it used to be dealt with should be our way. Then was then and now is now, but we should be aware that our experience today is the experience our forebears had so many, many years ago. It is what binds us together over time. It is something that we share, and sharing is (or should be) a big part of this time of year.

2011-12-19

Thinking even more about learning

If we accept that Carr may be onto something, then we must ask ourselves what behaviors, what learning, the Net supports. It would appear that it is largely behavioral: the successful following of a link, the Facebook 'thumbs-up', the signal tone indicating an incoming SMS … all of these are positive reinforcements that Pavlov and, in particular, Skinner could be proud of. This point needs to be explored in more depth, but we would be remiss to dismiss it all as the jaundiced view of a Luddite. Carr's point is that we are, in fact, rewiring our brains, and that this rewiring favours certain mental functions and capacities. If he is right, our understanding of education may be in need of revising … or saving, depending on how one looks at it.

Roszak, as we saw, made the case that it is our revised notion of "information" that makes the Net possible, and it is this notion that is undermining the institution of education itself. Carr is indicating that there is a body of sound, scientific, neurophysiological evidence to support Roszak's contention. It would seem worthwhile, then, to devote at least part of the literature search to establishing a foundation upon which a position can be taken, for if Roszak and Carr are correct, we may be undoing our understanding of the notion of education without even realizing it. We need to make a distinction, of course, between what the Net does best and what the Net can be used for. A shoe is not a hammer, but in a pinch it can be used to pound in a nail. A hammer may be better for pounding in nails, but if the window is open and it is breezy outside, it functions very well as a paperweight. In other words, it is not the tool in and of itself that determines its "best" use; rather, it is the situation, the context, and the intent that determine it. I believe we need to take a similar approach to the Net and technology-enhanced learning.

In terms of design, what works may be a reasonable enough approach. What is optimal may be a matter not of absolutes but of relatives. In design, awareness of consequences and side-effects may be as important as knowledge about design itself or about the technology platform on which the learning is supposed to take place.

References
Carr, N. (2010) The Shallows: What the Internet is Doing to our Brains, New York & London, W.W. Norton and Company.

Roszak, T. (1986) The Cult of Information: The Folklore of Computers and the True Art of Thinking, London, Paladin Grafton Books.

2011-12-17

Thinking some more about learning

In reading Nicholas Carr recently, it became clearer to me that we need to look more closely at the relationship between learning and technology, in particular learning and the technology of the Net (hereafter used to refer to both the internet and the World Wide Web as we find them today).

One of the reasons for introducing technology-enhanced learning into the educational system was to take advantage of whatever it is that the Net has to offer. What is this? Carr maintains that it is not what we think it is. He believes that there is a growing body of evidence that indicates that our ability to understand is being challenged by the Net. What does this mean? Understanding is our ability to give meaning to the world around us, to our lives. It is a matter of establishing a frame of reference against which (hopefully) sound judgments can be made. These judgments may be about which smartphone to buy or about whether we invade a foreign country.

Every judgment we make has, to be sure, a factual component, but I suspect it has a moral component as well. (Which smartphone – or whether a smartphone at all – says something about the buyer's view of the place of such technology in their lives and its potential impacts, say, on the environment. This is much closer to the moral dimension but is certainly not in the forefront of the buying decision. Where it should be is another issue that will have to be discussed elsewhere.) Understanding, hence meaning, is what makes our lives worth living, and so long as we have a why, as Nietzsche noted, we can endure almost any how. It is not a mere intellectual game, a cocktail-party sport, to consider what the world might become should we no longer have a basis for making a why-judgment.

The Net demands that we make many quick decisions. Hyperlinking demands that we decide, at a minimum, to-follow-or-not-to-follow. (That has become the question, but who cares any more about the Prince of Denmark?) This constant decision-making overloads the pre-frontal cortex to such an extent that the brain is more involved in deciding than in transferring material from working to long-term memory. If that transfer doesn't happen, we no longer understand, for the underlying schema upon which, or against whose background, long-term, life-relevant decisions are made is undermined. It atrophies. As Carr puts it, we become "mindless consumers of data" (p125). I'm not sure this is what we want going on in the classroom.

Reference
Carr, N. (2010) The Shallows: What the Internet is Doing to our Brains, New York & London, W.W. Norton and Company.

2011-12-15

Thinking about learning

Learning situations are especially complex. Everyone involved has his or her own predispositions, needs, wants, desires, experiential background, schooling, social environment and more, all of which impact learning. Learning, we may say, is the process in which the individual is changed (or is different at the end) in some way, be it in terms of knowledge, skills, competence, values, or their view of the world. In order to compensate for the variability of the individuals involved, a clearer understanding of the learning process itself can be helpful.

What is learning? How does an individual learn? What factors impact learning or influence learning? How are we able to take what we "learned" in one situation and apply it in another (transfer)? What roles does the mind or brain play in the process? These are the types of questions that a theory of learning should answer.

All theories are based upon certain assumptions that we make about the world around us. These assumptions are based on a number of factors, such as culture, language, or zeitgeist, among others. These assumptions sit deep and are, for the most part, self-evident, given, and unquestioned. For example, behaviourism, the dominant learning theory in the first half of the 20th century, postulates (assumes) that all learning manifests as observable and measurable changes in behavior. Cognitive learning theory, the most dominant theory in the second half of the 20th century, by contrast, attempts to explain learning in terms of brain-based processes, which are often compared to or measured against computers, an assumption which in turn is based upon a certain understanding of the notions of information and data processing. Constructivism, which is currently very much in fashion (in a variety of flavors), takes the view that the learner him/herself is actively involved and engaged in constructing his or her ideas and concepts, whereby language, culture, and experience all play significant roles in the learning process.

Unawareness of these underlying assumptions can lead us to conclusions that may not be as generalizable as we first thought. We cannot say that behaviorism is "wrong". There is a considerable body of evidence documenting that it works. We cannot say that cognitivism is "wrong", for here too there is much evidence that in certain situations and under certain conditions it works. The same applies to constructivism. What we can conclude is that there may not be one encompassing theory of learning; it may be that certain theories are more applicable to specific types of situations or types of learning, whereas others apply in other contexts or situations. In other words, these may not be mutually exclusive theories, but could even be complementary. Unfortunately, we do not have enough evidence to know which might be applicable in which situations, but based upon the evidence that we do have, it should be possible to draw reasonable conclusions for a given body of material, a given situation, and the goal that is to be reached.

2011-12-13

Technology and education

Neil Postman draws support for his views on re-valuing education from Theodore Roszak, who advances the idea that the educator's love of technology is in essence undermining the very institution s/he wishes to strengthen. By exploring the folklore of computers, that is to say, "the images of power, the illusions of well-being, the fantasies and wishful thinking that have grown up around the machine" (Roszak, p9), Roszak directs our attention to the fact that the Information Age has "now entered the educational curriculum in an aggressive and particularly insidious way which could distort the meaning of thought itself" (p10). He goes on to say, "The burden of my argument is to insist there is a vital distinction between what machines do when they process information and what minds do when they think" (p11). In fact, "If educators are finally swept into the cult [of information], we may see the rising of a generation of students seriously hampered in its capacity to think through the social and ethical questions that confront us as we pass through the latest stage of the ongoing industrial revolution" (p12).

Whereas the push to get computers into the classroom was originally justified by the ethereal and evasive notion of "computer literacy", we have since been convinced, to a large extent, that there now exists a generation of 'digital natives' (Prensky, 2001a, 2001b) who, having grown up with digital technology, show a natural affinity for it, an affinity that is not shared by (and, it is sometimes implied, not developable in) those born prior to 1990 or so. There is little evidence to support Prensky's claim; indeed, there is a growing body of evidence that this generation does not exist, at least not in the form Prensky envisions it (see Kennedy et al., 2007, among others). Nevertheless, it remains one of the most widespread and insistent notions circulating in educational circles.

It remains to be seen just what the computer in the classroom is good for. Experiments with programmed instruction, drill and repetition and the like have not brought the results originally promised by their creators. In fact, Roszak argues convincingly that the oft-touted simulation, considered one of the more recent and even more powerful e-learning possibilities, may in fact do more harm than good by depriving the learner of the experience of both failure and the complexity of interaction with the real world. He is realist enough to know that the vast majority of education-leavers will not be entering the highly paid, exclusive segments of society; rather, they will, like their forebears, be forced into marginal employment and socio-economic status. He insightfully points out:

"One might almost conclude from this fact that what the young most need to defend their interests in life is an education which will equip them to ask hard, critical questions about that uninviting prospect. Why is the world like that? Who made it that way? How else might it be? There are subjects that, when properly taught, help people answer those questions. They are called social science, history, philosophy. And all of these are grounded in the sort of plain, old-fashioned literacy that gives inquiring minds access to books, to ideas, to ethical insights, and social vision." (p72)

We have some serious re-thinking to do.

References
Kennedy, G., et al. (2007) "The net generation are not big users of Web 2.0 technologies: preliminary findings", ICT: Providing Choices for Learners and Learning, Proceedings ascilite Singapore 2007 [online], http://routes.open.ac.uk/ixbin/hixclient.exe?_IXDB_=routes&_IXSPFX_=g&submit-button=summary&%24+with+res_id+is+res19981 (accessed 2 February 2010).

Postman, N. (1996) The End of Education, New York, Vintage Books.

Roszak, T. (1986) The Cult of Information: The Folklore of Computers and the True Art of Thinking, London, Paladin Grafton Books.

Prensky, M. (2001a) "Digital Natives, Digital Immigrants", On the Horizon, MCB University Press, vol.9, no.5; also available online at http://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf (accessed 12 January 2009).

Prensky, M. (2001b) "Digital natives, digital immigrants, Part II: Do they really think differently?", On the Horizon, MCB University Press, vol.9, no.6; also available online at http://www.acpinternational-dc.org/articles/digitalnatives2.pdf (accessed 13 February 2009).

2011-12-11

Technology in education

Neil Postman advances a passionate case for re-valuing education in America, but much of what he has to say would apply in other countries as well. For him, education, at heart, is about finding and developing shared meaning, as one finds, for example, in narrative – one's own story, a nation's story, or any story for that matter: "Without a narrative, life has no meaning. Without meaning, learning has no purpose. Without a purpose, schools are houses of detention, not attention." (p7) The primary function of education, however, has been directed ever more toward utilitarian aims, such as vocational qualifications or the more sinister 'employability'. The stronger such thinking becomes, the more willing educators are to take an engineering approach to education: if it is planned and designed properly, we will get the greatest value from it. Postman notes, however, that there is no one right or best way "to know things, to feel things, to connect things", and he goes so far as to maintain that making such a claim in fact trivializes learning, reducing it to a mechanical skill. (p5) I couldn't agree with him more.

This attitude has much to do with our modern attitude toward technology, especially among educators. The zeal with which many advocate technology borders on the religious, as he points out. For

"[…] at some point it becomes far from asinine to speak of the god of Technology in the sense that people believe technology works, that they rely on it, that it makes promises, that they are bereft when denied access to it, that they are delighted when they are in its presence, that for most people it works in mysterious ways, that they condemn people who speak against it, and that, in the born-again mode, they will alter their lifestyles, their schedules, their habits and their relationships to accommodate it. If this is not a form of religious belief, what is?" (p38)

What he advocates is 'a serious form of technology education' (p43), that is, 'making technology itself an object of inquiry' (p44), hence, the role of technology in technology-enhanced learning (TEL) or education in general is worth taking seriously and looking at critically.

As I've mentioned before ("IT Envy", 2011-12-09), we should be talking about technology-enhanced education, not technology in education. I believe it is critical to put technology in its place. Technology is a helpmate, not an end in itself, and when dealing with education, it is particularly important to keep this in mind. In part, it is the technology-centred, engineering-affine approach that is so often advocated that motivates this particular thesis topic. One of the primary purposes of the thesis is to place the notion of design, a function of technology, soundly in the service of teaching, learning and education, not to make it their master.

Reference
Postman, N. (1996) The End of Education, New York, Vintage Books.

2011-12-09

IT envy

The promotion of data to information and the reduction of knowledge to information have farther-reaching consequences than we may want to admit. This is particularly true when we link it to our obsessive compulsion with "having", with owning, with property. Remembering Mr. Eliot, we can say that wisdom is a priceless pearl, knowledge an expensive luxury, but information … well, that's a commodity that can be bought and sold just like any other. IPR shouldn't stand for "intellectual property rights"; it should stand for "information property rights", for in so many cases that is all we're really dealing with.

Our current motto, "s/he who dies with the most information wins", loses its whimsical nature. It used to be (at least in Francis Bacon's day) that knowledge was power; today information is power. Ergo, whoever has the most and can process it the fastest, turn it around and fire it back the quickest, and generate and accumulate the most, is the winner (read: the best, the smartest, the most powerful …). And what can do just that? Of course, the computer. And so we raise the machine to our ideal, we yield to its unerring accuracy, its lightning-swift sifting, sorting, and filtering, we pay homage to its power. Yes, in a sense, we begin to worship it for its godlike power.

You think I'm exaggerating? You should think again. This might be how I observe things, but I'm certainly not the only one to have noticed.

Günther Anders, the contemporary German philosopher, has gone so far as to argue that we have in fact become ashamed of our humanity, of being born; deep down, in our heart of hearts, we wish we could be made like our computers. The first time I read this, I was taken aback, I'll admit, but if you take the thought seriously, Mr. Anders is onto something. Have you ever heard the argument that a given course of action is the best because someone "crunched" the numbers and that's what came out? In that moment, we don't even think of questioning the outcome: they're numbers, the computer said so, who are we to argue? Money never sleeps, electronic trading never tires nor errs, businesses are bought and sold, hundreds of millions of lives are affected, both directly and indirectly, business deals are automated – all without ever being touched by human hands. One of our most secret desires is simply to get those fickle humans out of the equation. Whenever they get involved, let's face it, they just screw things up.

We place way too much faith in digital technology, in information processing. The information-processing model of mind is the dominant theory these days, but it couldn't be farther from reality. It began as a mere metaphor for describing an exceedingly complex phenomenon, but we've gone beyond the metaphor and now mistake it for the reality. That's a dangerous step, and one we should perhaps think about retracing.

Reference
Anders, G. (2009) Die Antiquiertheit des Menschen I, 3rd edn, Munich, Beck.

2011-12-07

The man behind the curtain

The "information" gadget non plus ultra is the computer. Don't we just love our computers. It doesn't matter what size, shape, color or speed, we love them. They're everywhere, too. Our traffic is controlled by computers, they help us fly and land planes, they control much of our driving ability, they collect data against terrorists (or innocent people, it doesn't matter, as long as they are collecting data), they get us to the moon, Mars or Jupiter, they guide our missiles, they give us our passports and they send us notices from the finance authorities. They are simply everywhere. We can't work without them anymore. In fact, we can't live without them anymore: from copy machines, to (not-so) smart phones, to laptops, notebooks, and desktops, they are an intimate part of all our lives. We can't get enough of them and we have no idea what they do to us. We think we know what they do for us, but it's a lot less "for" than "to".

Still, I just love the words we use to describe them: fast, powerful, smart, and – my personal favorite – sexy. They're machines. They're things. OK, Steve Jobs and Apple tried to make them accessories and furniture, but they didn't quite succeed. Computers have receded into the nethermost corners of our lives, and these are the most nefarious, because we take them for granted. But the ones we "have to have", that we flash around, are accorded a reverence that they may not deserve. What's so special about them anyway?

They sit on our desks and stare us down, workday in and workday out. They spit out reams of tables and figures that we don't have the time to double-check, so we take them as correct. They give us access to others because we hardly get out at all anymore. They get us things (downloads), make our lives convenient (just ask Amazon), and control every move we make, every thought we think, and every moment we'd like to rest. That's not what I call "special". But, in spite of it all, they can get more bits to more places faster and more reliably than has ever been possible before. And it is this simple illusion that makes us think that "we are making real cultural progress – and that the essence of that progress is information technology". (Roszak, 1986, p. 29)

So, is that progress? Is this what we understand progress to be? It is becoming ever more difficult to distinguish between what is real and what is not. Is that what we want? Have we ever stopped to ask ourselves how substantial all this information technology is? I don't really want to believe that progress is an illusion, but when we stop to take a look, things may not be as real as we would like to believe. Simply pay no attention to the man behind the curtain.

Reference
Roszak, T. (1986) The Cult of Information: The Folklore of Computers and the True Art of Thinking, London, Paladin Grafton Books.

2011-12-05

Roszak and Eliot

If you have never read Theodore Roszak's The Cult of Information, you don't know what you're missing. It should actually be mandatory reading for anyone who thinks they have something to say about technology or the so-called information age. Of course, to get anything out of it, you'd have to approach it with an open mind, so if you're wary of technology or downright obsessed with it, it would be better to just let it go. I don't contend that everything the man says is right, but I would argue that what he has to say is worth thinking about. William James once remarked that what most people call thinking is simply a rearranging of their prejudices, and I can't say much has changed since he said it. But if you are willing to earnestly consider a clearly stated position on a well-defined subject, I would say the time spent with the book would be more than worth the effort.

There was a time – and not all that long ago – when a distinction was made between some very similar things. For example, data was just whatever it was: a date, a color, a statement, a fact. Information was something "more": it was data that was used to make a decision, or at least contributed to the making of a decision. Knowledge was something "more" than information; it was something one knew, which could be used to act in an informed way, to exercise a skill or provide an argument. Finally, at the top, we had wisdom ... well, who even knows what that is any more? This isn't a new phenomenon, I'm afraid. In 1934, the poet T.S. Eliot wrote (in The Rock):

Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?

He knew where we were headed ... and what do you know? We're finally there. We no longer talk about data at all, and everything else ... and I mean everything else ... has been simply turned into information.

If the motto of the 80s was "whoever dies with the most toys wins", the motto today is "whoever dies with the most information wins". Though I would say that whoever dies, dies. It doesn't really matter what they have when they do.

Let's face it, we love information: from baseball statistics to football scores, to monthly rainfall measurements, to the mileage we get with our cars, to stock-market prices and indices, to interest rates, to the most common boy's name of 1913, to the number of jobs not created since the latest tax cuts. It doesn't matter what it's about, as long as it's what we think is information, we love it.

In this regard, the title of Roszak's book is not all that far off. We've made information and the acquisition of information into a kind of cult. And as is the case with every cult, it needs its priests, and there is no shortage of them either. We call them "experts" these days.

2011-12-03

Luddite?

Is it just me, or is there something wrong in calling someone a Luddite just because s/he is not obsessed with technology? I mean, is technology all that we have to say or show for ourselves? And what is technology anyway? And who decided that it's our be-all and end-all?

Recently I read an article lamenting that we were risking our children's futures because we weren't teaching them to program: if they don't learn that, they are doomed to become unfit for the world of tomorrow, and we're cheating them of opportunities. Today, three-quarters of the population of Germany, for example, has a driver's license, but how many of these almost 60 million people can fix a car? Moreover, 100 years ago, when we were just getting rolling in the automobile society, how many people, let alone educators, were running around claiming we would be robbing our children of their futures if they didn't learn automotive mechanics? What makes digital technology so different?

Well, nothing fundamental, at least as far as I can tell.

Cars are a kind of tool, one that helps us get from point A to point B. They were clunky, temperamental, difficult to operate at first, but over time, they got easier and more comfortable to use. What is more, they are becoming so reliable that some are speculating when we'll be able to produce cars that won't need maintenance anymore. Relatively speaking, we've come a long way in the 125 years since Herr Benz registered his patent.

Computers are also a kind of tool, one that helps us do other things. They were clunky, temperamental, difficult to operate at first, but over time, they got easier and more comfortable to use. What is more, they are becoming so reliable and so compact that some are speculating when we'll be able to produce computers that are always on and always connected ... for everyone. As things move much faster in the computer world, we've come relatively far relatively faster than we did with cars, but the developments resemble each other in important ways.

What we failed to ask ourselves then (with cars), and what we are failing to ask ourselves now (with computers), is what this technology is really good for. Amid the technological ecstasy that abounded back then, very few were asking about the worth of the technology. Urban sprawl, environmental pollution, junkyards, the Rust Belt, resource depletion, and many, many more issues were not thought about and not of much interest. The technology was going to do it for us ... now the technology is doing it to us.

A Luddite was one who opposed technological progress, not technological obsession. We don't really have a technology problem today; we have an obsession problem. Of course there are jobs and incomes and revenues and stock prices that are intimately connected to the technology, but just because we have linked them now doesn't mean we have to keep them linked forever. No matter what we decide to do with our world, jobs and incomes and revenues and stock prices will be intimately connected to it. Obsession, however, is a serious signal that something is not right, that there is something unhealthy afoot.

If we really want to do something for our children, I think it would be wiser to teach them how to remain healthy.

2011-12-01

Occam's razor

In keeping with the last entry, I'd like to continue with the theme of simplicity. Last time we saw that the world is by nature complex, whereas most everything we humans get our hands (feet, minds, ...) on becomes complicated. What is more, complicated things are difficult, but complex ones really are not.

Sure, I can hear all you scientists out there yelling that we've only begun scratching the surface of understanding many of the world's phenomena, from the human genome to the intellectuality of one-celled organisms to parallel universes to slicing bread. But we should not mistake the map for the territory. I fully agree that we've only begun scratching the surface, because from the outset we are complicating things. Complex things – with a bit of patience and perseverance – can be grasped, but we have to get our egos out of the way and let our consciousness work its magic.

Of course, since we're such technological creatures, we need tools. We love tools. Most of us don't work well with tools, but we love them anyway. And one tool that can help us out here is Occam's Razor. Well, it isn't really a razor like the ones we use to shave with; rather, the term refers to a way of thinking, a sharp way of thinking, a heuristic, and by thinking sharply and cleanly we can avoid a lot of complication.

As a side note, the tool is named after William of Ockham (ca. 1285-1349), an English logician and Franciscan friar, to whom it is credited although he didn't come up with it. The same thought as the razor is to be found in Maimonides (1138?-1204), Thomas Aquinas (1225-1274), and John Duns Scotus (1266?-1308). It was Sir William Rowan Hamilton (1805-1865) who first used the term itself.

The form most often used by Occam himself is numquam ponenda est pluralitas sine necessitate, or, for those of you who opted out of Latin in school, "plurality ought never be posited without necessity". This should be thought of along with the principle of economy, which was well known to him and his predecessors, namely frustra fit per plura quod potest fieri per pauciora, or "it is futile to do with more things that which can be done with fewer". One of the most common formulations today is

Explanations should never multiply causes without necessity; when two explanations are offered for a phenomenon, the simplest full explanation is preferable.

So, I think you can see where I'm going with this. The next time you listen to all those explanations on the news about why gas prices must be tied to oil prices, or why it "makes economic sense" to ship tomatoes from Florida to Mexico to be packaged so that they can be shipped to New York for sale, or why there is allegedly some inalienable right for private citizens (as opposed to citizens who are part of an official militia, like the Swiss) to own guns, or why a society can afford to pay CEOs hundreds of millions but can't see to it that single mothers can make the rent, or why any one religion got it all right while all others got it all wrong (... the list goes on ...), think of William and his razor.

It's been a long time since William, and longer since we took him seriously, so let's take his razor as the goal to strive for. In the meantime, any simple, sensible explanation will do. Do we still have it in us, or have we lost it completely?

2011-11-29

Complex or complicated?

Wondering is something I do often, though I don't know how well. Still, I have been contemplating a pair of notions again, namely complicated and complex. Maybe it's because the subject came up at a recent strategy tutorial. Maybe it's because I spent three days last week immersed in the details of a software application that had little relevance to the world out here (but we absolutely have to have it). And maybe it's because there's simply a big difference between the two concepts.

Obviously, they have quite a bit in common: lots of little parts, lots of relationships, lots of interconnectedness, lots of unknown beginnings and ends, lots of back-and-forths, lots of gives-and-takes.

Take any living organism, from single-cell plant or animal to human beings. What a source of wonderment, fascination, and awe, regardless of how you think they got here. The same applies to inorganic substances as well: just look at a petroleum molecule or a salt, and you will get the gist of what I mean.

What all of these have in common is an innate sense of beauty: from the chain structure of certain molecules to the perfect symmetry of crystals, their complexity brings forth shapes, colors and textures that literally take one's breath away. What they also have in common is that they are natural, that is, they occur in the world as we find it. And what they also have in common is that each of these examples is complex. It all fits together. It all works. The elegance of functionality wrapped up in the complexity of its being.

Complicated, on the other hand, is a bit different. Sure, there is what appears to be complexity, but it doesn't really work. Software systems, government systems, pension schemes, businesses large and small, school systems, education systems, security systems ... the list goes on and on. We make these things taking nature as our (role) model, but somehow we just don't get it right. Let's face it: most things are pretty well broken. Even five minutes of American infotainment lets that cat out of the bag.

And all these things (these machines, systems, programs, organizations) also share a common characteristic: they're ugly. Who has stood in awe of the Social Security Administration? Who has had their breath taken away (in a positive sense) when passing through customs at the airport? Who has been (lovingly) brought to tears by the sight of the garbage piling up on the streets? I don't think so. But all of these things have another characteristic in common: they're man-made, as we used to say, or person-made, if I'm to be politically correct (though I have to agree with Rhett Butler here: Frankly, [...] I don't give a damn!).

So, it would seem that at base the difference between the complex and the complicated lies in the simple fact that if we've had our fingers in it, it's complicated; otherwise it's probably complex. It's as simple as that, for in the end, life is a rather simple affair. We (choose to) make it complicated.

2011-11-27

Propaganda

Picking up on the theme of my last post, it occurred to me today that a very important word is both losing favor -- most often simply through misuse -- and shifting its meaning. This time the shift is not toward another word in the same domain, if you will; rather, it is sliding into a separate domain and will be overwhelmed by the current standard term there. That word, of course, is "propaganda". We should keep it around, I think. It's a good word, and it actually communicates quite a bit. But what is propaganda anyway?

If we stop to reflect for a minute, we see it is a form of communication. Its primary purpose is to influence its listeners (or readers or viewers, depending upon the media). The influence should be in a particular way, of course, that is, in favor of the position or assertions being made by the issuer, most commonly but now hardly ever known as the propagandist. The influencing is accomplished by presenting facts selectively. That doesn't mean it's an outright lie; rather, the selection allows the issuer to simply leave things out in order to encourage the listener to draw particular conclusions. What is more, propaganda is generally presented emotionally. Emotion, not reason, should be directing the decision-making processes of the listener. That's how we generally understand the term these days.

Granted, this was a political term for the longest of times, but I don't hear it so much in that domain of discourse any more. I do see exactly that happening on a daily basis, and that is my cause for concern. We are all confronted with it day in and day out ... Americans and Brits more than, say, Germans or the French, but no one these days is immune to it. We know it as advertising.

Now before our marketing friends get all weird on me, stop and think about it for a minute. Advertising is a form of communication. Its primary purpose is to influence its listeners (or readers or viewers, depending upon the media). The influence should be in a particular way, of course, that is, in favor of the position or assertions being made by the advertiser. The influencing is accomplished by presenting facts selectively, also in order to encourage the listener to draw particular conclusions. And I doubt that anyone would disagree that it is presented emotionally. So where, I ask you, is the real difference?

I don't see one, so I'm sticking to my guns.

Generally, things like this wouldn't bother me all that much, but I am reminded of McLuhan's aphorism on the one side: the price of eternal vigilance is indifference (we simply get worn down by it all, throw in the towel, and end up with too big a car payment or a world war or whatever). On the other side, there aren't enough people these days educated enough to (a) know what propaganda really is or (b) have been taught or allowed to think critically enough to defend themselves. We can all do something about (a), but (b), well, that's a bigger problem ... and one that's bigger than I want to address at the moment.

The next time you're being schlocked or enticed, or being amused into reaching into your pocket for money, I'd suggest you stop and think. Of course, what works in marketing, well, that should be right and proper for politics now, shouldn't it? Think about it.

2011-11-25

Say what you mean

What do you do when the words you've used all your life are no longer usable? Not that they're worn out, no, just that they don't mean what they used to. Someone came along and changed them. Is that even possible? Well, truth be told, it is.

It's not a new thing, really. Not two hundred years ago, if you used the word "awful" it was a good thing. What you were experiencing, looking at, feeling, filled you with awe. Not Rumsfeldian awe, real awe ... wonder, admiration, the kinds of feelings many folks used to have toward G-d, when there was allowed to be One. Something "awesome" had only some awe, and not necessarily good awe ... pre-Rumsfeld, but more that kind of awe. In other words, it was a relatively negative term. Today, though, they've switched places.

This is, I suppose, a natural process of language, but I missed the awful/awesome switch, and I can't say I'm sorry. But the one we're seeing now is a little more troubling to me.

I'm talking about very basic political orienting terms like "conservative" and "liberal". Growing up, I always thought that conservatives were people who wanted to conserve, keep, protect, hold on to something. And for the most part, that's what it looked like they were doing, even if I didn't particularly agree with them. On the other hand, the root of the word "liberal" is the Latin word for freedom, that is, liberals were the ones advocating change, more often than not in the sense of more individual and collective freedoms, even if I didn't agree with where they were going with some things. That's all changed now.

We may talk about neo-liberal economics, but the "neo-" prefix is a matrixian sleight-of-hand to make you think it's about the free flows of money or something. It's not, it's about restricting those flows to some unspecified club of folks who think that agreeing on bonus contracts is a more valuable skill than actually protecting people's savings. Of course, our friends at Faux News like to use it as an epithet to simply discredit anyone they don't want having any say. This is perhaps the most heinous usage, but it does seem to be the one that is catching on, so we have all of this "ilk" being tossed into the same pot for regular stirring.

On the other hand, I had to sit up and take notice when I read a recent article by the linguist George Lakoff, who rather poignantly pointed out that the upswell in alleged conservatism in the United States is anything but that. Instead, they are actually redefining and recasting a number of notions I once thought I knew the meaning of, like privacy, democracy, values, and basic human rights. Just when I thought I was beginning to understand the world and what people were saying, I find out -- once again -- I don't. Oh well, back to the dictionary, I guess.

Like I said, shifts in meaning are a natural part of language, and I actually believe that the awful/awesome switch was the result of a natural process. What is happening now, however, is quite a different story. I guess I have to be on my toes to find out what some people really mean. I wish they would just say what they mean ... and mean what they say. But maybe I'm not supposed to know.

2011-11-23

We're going where?

We've never had it so good ... or so I'm told. But have you ever stopped to think about anything you thought you already knew? For example: what does "good" mean in that opening line? Do we know what "good" is? Do we all agree? Do we agree generally or specifically? Or, what about Good (with a capital "G") or "the Good"? It's not something we do every day, but it is something that we should perhaps do more often.

Actually, I've got a number of favorites, if you will, and among them are "better" and "progress". I picked those two because they are often related in our thinking. I mean, you often hear them used in the same sentence: "We have progress to thank for the fact that things are better now than they ever were before." Granted, we need to know what "things" we're talking about, and we also need to ask "where" the speaker is (I'm not sure a mother watching her child in Somalia starve to death is thinking that particular thought).

But let's just stick with the "better" and "progress". First, however, a quick grammar lesson: the word "better" is a comparative. It is the comparative form of "good" (good, better, best; whereby "best" is the so-called superlative). If we were wondering earlier what "good" is, we have to wonder even more about whatever it is that's more than that. The point is, however, that "better" can only be seen in relation to what we believe to be "good". If we don't agree on the starting point, the next step is always more difficult. And then we have "progress". We probably generally agree that when we speak of progress we are speaking about some sort of advancement (an implied "better"), some kind of moving forward (wherever that is), or movement towards a goal. So, where has "progress" brought us that everything is "better" than it was before?

Let's take one of my favorite topics: time. We're a fairly techno-crazy generation: smart phones and iPads and broadband and personal navigation systems. Truth be told, among the promises of the Industrial Revolution were inexpensive machines and labor-saving devices. Where did they all go? All our most modern gadgets are anything but time-saving. Always on, always accessible, always connected ... and the family? Automation promised us more free time, more leisure, but how many of us have any leisure at all anymore? What do "power weekends" have to do with relaxation? At the time of the American revolution, the average person worked about 1000 hours a year, and the whole basis of our economy was agriculture and crafts. A relative of mine told me that when he started his job at the bank, he was required to put in 140 hours a week ... not in writing, of course, but nevertheless "demanded". The average American is now working 3 jobs (if they're working), mostly part-time, so there's not a lot of time with anybody, even if they do have a family. I read recently that had we plowed all the time we saved with industrialization into the work week instead of into the pockets of a few, we'd only have to work about a day a week (at livable wages); instead, everyone in the Western world is being forced to work more and more for less and less (real wages in the US have been stagnating or declining since 1979).

And so my question: is this progress? Is this better? I don't think so. But, that's just me. Each and every one of us should take a bit of time and reflect on just what's good and what might be better. It's not just a matter of personal preference; I'm talking about something much more fundamental. I'm talking about real quality of life, not just the promise of it.

2011-11-21

Fear of creativity

There is a huge difference between how things are and how they could or should be. Likewise, there is a huge difference between how things are and how they got that way. We need to think about both, though I sometimes get the impression that we don't really think about either.

Let's take Occupy Wall Street as an example. It hasn't ceased to amaze me how many people who make statements about it haven't looked at it closely enough to know what it is. In other words, their perception is blurred. It also hasn't escaped my attention that some want it to be one thing, while others want it to be something very different. In other words, their interpretation is confused. Finally, there are those who will tell you what it is not, who will try to discredit it for discreditation's sake. In other words, their comprehension is incomplete.

Contrary to what (too) many people believe, being against one thing doesn't automatically make you for its opposite. Also, agreeing with part of something doesn't automatically mean that you agree with all of it. For example, I might be pro-OWS, but that doesn't mean I'm automatically against capitalism. I can agree with them that corporations have undue influence in the American political process, but that doesn't mean I agree with all the tactics they employ to make their point. In other words, what I'm making a plea for here is discernment.

All that we are dealing with in the wake of the financial crisis and bailout are things that needn't be the way they are. Anything involved could be different if we choose to make it so. We have made money the measure of all worth because we can't agree on higher values anymore. We think the economy is more important than society because we can't talk to each other reasonably anymore. We fear losing what we have because we no longer know who we are. Every issue we are facing is a problem that we humans have created for ourselves. Money is not natural; the organization of human society is not natural; economies are not natural. They are all things that we, in one way or another, for one reason or another, consciously or unconsciously, have simply made up. None of these arise from some unalterable natural law; rather, cultures, living spaces, ways of interaction, values, organizations are things that we humans have called into existence. They are as they are at the moment, but we all agree that they were not always that way, nor need they necessarily be that way now. They are as they are because we more or less decide to have them this way.

We, I'm afraid, have simply backed ourselves into a corner and are being controlled by our fears. Is it really an unalterable fact that some banks are too big to (let) fail? No, that's a perception. Is it the consequence of some natural law that some countries are seen as too big to save? No, that's an interpretation. And do we really, truly believe, well, things have always been this way? No, that's simply miscomprehension.

My question is, what are we afraid of? Of losing what's not ours to begin with? Of having to listen to someone we disagree with? Of respecting another simply because s/he is an other? Of having to share or (gasp!) care about something other than ourselves? Of having to think about how you feel about something that affects all of us? Of maybe having to learn that whatever you think, feel or believe is one way of thinking, feeling and believing, but not the only way? Of finding out that just because I don't think what you do that both of us may be right (or wrong) in different ways? I don't get it.

Anyone who thinks that things are just fine the way they are simply doesn't get out enough. I would maintain the changes we need to make are simpler than we now imagine, but only if we start working together. It's a choice, not a law of nature.

2011-11-19

Corporations are people?

Two recent events got me thinking about the corporations-are-people ruling and made it clear just how sad that decision was. It is an excellent example of simply not thinking a thought through to the end. The first event was a sign held up at a recent Occupy Wall Street protest in New York, which read, "I'll believe corporations are people when Texas executes one." The second was a statement Marianne Williamson made in her talk to the Occupy Wall Streeters in Berkeley, that "Corporations can't be people, they don't have souls." While each of these statements certainly has its merits, it would be too easy for my atheist friends to discount Ms Williamson's statement, so I thought it would be worthwhile to revisit the issue in a more mundane way. After all, the only word that springs to my mind when I hear the corporations-are-people phrase is "absurd." Here's why.

We can argue about whether people, that is, human beings have souls. That's fine. However, regardless of how one thinks about that, I would suggest we all agree that you can't be just part of a person, or part-person. That is the essence of what we call "discrimination". You can lose a limb or just have your appendix taken out, and you are still a person. You can have a birth defect or suffer a debilitating injury, and you are still a person. The whole part-of-a-person argument was actually put to rest when the 14th Amendment finally overcame that ridiculous idea that blacks were only 3/5 people. (In other words, this isn't the first time we've had to face absurdity head on.)

The consequence of this thought is that corporations can't, then, just have some rights and not others. If you are a person, you enjoy all the rights that other persons have, not just the freedom of speech in certain instances, like elections. This is what our first protester was getting at. Now, I'm no advocate of the death penalty, but I have nothing in principle against jail, so why can't corporations be sent to jail? It's simple enough: corporations can't act on their own, but their boards of directors, first and foremost, and their owners (read: shareholders) can, and do. If the corporation does something illegal, as was the case leading up to the recent financial crisis, then we know who should stand trial and, if convicted, go to jail. These two groups represent the head and heart of the corporation. You shouldn't have to go looking for some individual within the organization who should be punished for wrongdoing; the corporation did the wrongdoing, and that is who needs to answer for its actions. That's what it would mean for corporations to be people.

These days we like to pick and choose, to take what's best for us and just ignore the rest. If "the markets" are made up of "investors", as the name implies, those investors have a vested interest in the actions of the corporation. As many like to point out, they are the corporation. And their interest cannot be only to derive benefits and never have to face the consequences of unpleasant or even illegal decisions. Yes, I know, when you think about the sheer numbers of people that would have to be incarcerated, the mind begins to boggle, so to save ourselves from this ridiculous fate, we need to let the people in the judicial old-folks home in DC know that they haven't thought their thought through to the end.

2011-11-16

What can I say?

It struck me recently that not only are attention spans diminishing at a frightening rate, and not only are sound-bites getting shorter, but they're also becoming emptier. When is the last time you heard a clipped quip on the news that really made any sense? I'm guessing it's been a while, because I'm having trouble finding people who say anything anymore. You would think in a situation like this one, words would become all the more important, but it seems to me that they are being emptied faster than the phrases that are supposed to give them meaning.

How about "free market"? There's one that gets bandied about a lot. My question is, though, just how free is a market if a non-market player steps in and hands over almost a trillion dollars to keep it alive? I'm not sure I understand either part of that phrase in that context? Or, how about "liberal"? I like that one too. It derives from the Latin liber, meaning free, but it is used at least in America these days as an epithet for socialist dictatorship, which is about as "unfree" as you can get. How does that happen? Or, what about that other favorite epithet of my youth, the really bad tag to hang on someone, "communist"? We don't use that at all any more as a term of derision. Why is that? Oh, right, because the only remaining communists in the world, the Chinese, our preferred trade partner, have outed themselves as the biggest capitalists of all.

I think I'm onto the problem, though. We don't use many words as words anymore. We prefer labels. Label someone as something, and you don't have to talk to them, you don't have to discuss anything with them, you can simply ignore them, because who would want to be associated with "those types"? This has long been a ploy in politics, but it has begun to permeate every aspect of our lives. We find the technique employed in the media, in schools, most definitely in sports, but at home as well.

I'm not the first to notice this, believe me. These are what Pörksen calls "plastic words"; they are "argument killers": throw these words into a discussion or debate and it grinds to a screeching halt. There's no way to counter them: they are so embedded in our personal ideologies that no criticism will be tolerated. But that's the problem: they are the means of argumentation of intolerance. It would do us well to start thinking about words and then start using them as they were intended: as a means of communication.

2011-11-14

Sunday without sports

Not being much of a movie-goer, I decided to make it a double-feature yesterday ... but I was at home. There's way too much noise and confusion in the theaters these days. What an exciting time: explosions, theft, deceit, intrigue, obscure twists and turns, tension-filled climaxes and even the occasional ray of hope. Oh, what was the double feature? Inside Job and Capitalism: A Love Story.

OK, maybe "exciting" was a bit misleading, but I was on the edge of my seat ... mostly because I wanted to jump up and scream. Are these two films portrayals of absolute truth? Of course not. Are they particular views of recent events? Of course they are. So what's there to get upset about? It's simple: if any of it is true, we're in bigger trouble than we want to admit, and the integrity of everyone involved in perpetrating this trouble has disappeared in a puff of smoke.

Henry Ford once remarked that if the American people understood the banking and monetary system of the country, there would be a revolution tomorrow morning. For whatever distortions the film-makers may be accused of (and they will be accused, I am certain), either film (but better both together) can serve as an initial primer not only on what our banking and monetary systems have done to us, but also on why capitalism does not really do all that much for us and on how it rather does it to us.

Part of the difficulty in understanding that simple distinction lies in the fact that we moderns are having trouble finding an agreed upon sense of values. We can talk about social values or even family values, but in the economic ideology called capitalism, only one value counts: money. It's sad. As I noted last time, given how we've decided to slice up the world, money is as necessary to business as food is to people, but does it have to be the be-all and end-all of our lives? No. We should be eating to live, not living to eat.

When things can only be understood in terms of their monetary value, well, they really have no value at all. You can't put trust or truth or honesty or compassion on a balance sheet. Any accountant will tell you that you don't want them there. There are other ways to think about anything, but since they bring no monetary gain, they are simply not worth talking about. Yes, that's how far we've come, but we haven't hit bottom yet ... I almost can't wait for the sequels to both those films, but I may not even be able to watch them ... hell, I may not even be allowed.

2011-11-04

Money 101

Recently, I've been giving business and financial institutions a bit of a hard time, so it only seems fair to add a little perspective to the discussion. Though to many it seems that business is only about money, it is nevertheless worth asking ourselves what is the real role of money in business.

Perhaps an analogy can help. Strategy is not just a business concept, it is an everyday concept as well. Strategy, in a certain sense, is the answer to the simple question we have all been asked: "What do you want to be when you grow up?" Each of us makes decisions early in life that determine the course of actions leading us into certain fields and careers. Any life counsellor will tell you that the person who is following their heart (doing what they like) is the happier and more productive person. We look to the future, fix our eyes on a goal, and then continuously work towards it. That is thinking and acting strategically. The only real obstacle to it all though is staying alive. Life is, after all, an exceedingly risky undertaking.

In the business world, staying alive is objective #1 ... not strategy #1. Staying alive is the objective, being something when we grow up is the strategy. In addition to shelter and clothing, we all have to eat, that is, we need food to survive. In the industrialized world, many people used to grow their own, but the specialization of labour has removed most of us from food-production, so we must go elsewhere to get our food. Businesses, in a very strong sense, mirror their creators. What is it, then, that any business needs to stay alive, to live to fight another day in the sometimes dog-eat-dog world of global commerce? Yes, money.

There we have our analogy: money is to business as food is to people. Today, we have become divorced from our sources of food; we do not always eat what is good for us. Many people are overweight, others are starving. Certain things are in abundance in some areas, non-existent in others. In the western world, a certain knowledge of nutrition, then, is a good thing. What to eat, how much to eat at what time of day or in which season, different ways of preparing food that make it more digestible and useful to our bodies are all good things to know. Those who inform themselves and act accordingly tend to lead longer, healthier, and, in the end, happier lives. Similarly, in business, setting reasonable profit margins, building reserves (retained earnings), maintaining a "healthy" level of debt, and making acceptable, suitable, and feasible investments are ways toward a longer, healthier business life.
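
To make the nutrition analogy a bit more concrete, here is a minimal sketch of the kind of "vital signs" just mentioned -- profit margin, retained earnings as a reserve, and the level of debt. The figures are entirely hypothetical, invented for illustration only, and what counts as "healthy" varies by industry:

    # Hypothetical annual figures for an imaginary business, in thousands
    revenue        = 1_200
    expenses       = 1_080
    dividends_paid = 40
    total_debt     = 300
    owners_equity  = 500

    profit         = revenue - expenses          # what is left over after "eating"
    profit_margin  = profit / revenue            # how nourishing each unit of revenue is
    retained       = profit - dividends_paid     # the pantry: earnings kept as reserves
    debt_to_equity = total_debt / owners_equity  # how much of the diet is borrowed

    print(f"profit margin:  {profit_margin:.1%}")   # 10.0%
    print(f"retained:       {retained}")            # 80
    print(f"debt-to-equity: {debt_to_equity:.2f}")  # 0.60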

Finance, that is, knowing how to deal with money in the context of business, is not what business is all about. Business is about producing the better product, providing the better service and fulfilling market needs. To do this, however, we need to understand finance so that we can lead the kind of business life that enables us to be whatever it is we want to be when we grow up.

2011-11-02

Stock market 101b

As I mentioned yesterday, businesses have three options to generate extra cash, the third of which is issuing stock. We also saw that a stock issue can be private, but such offerings can also be public. These are the infamous (if at times not notorious) IPOs or "initial public offerings" that get lots of media coverage if they are big enough. In this case, the company decides to sell shares of ownership to the public, in the hopes that the demand for the new stock will raise the share price and thereby generate more cash.

A few years ago, a German low-cost airline went "public" and sold €1,000,000,000 worth of stock on the first day! Not bad, eh? But this is where the "stealing" comes in. They didn't take all that cash home with them. After paying fees and premiums and costs for staging the sale, they had a mere €400,000,000 to take home. I don't think it is out of line to wonder why the people who put on a sale earn more than the folks for whom the sale takes place, but that's another story.
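
Just to put a number on that, here is a quick back-of-the-envelope calculation using only the (rounded) figures above -- illustrative, not an audited account of the offering:

    gross_proceeds = 1_000_000_000  # euros raised on the first day, per the figure above
    net_to_company =   400_000_000  # euros actually taken home, per the figure above

    fees_and_costs = gross_proceeds - net_to_company
    share = fees_and_costs / gross_proceeds
    print(f"{fees_and_costs:,} euros ({share:.0%} of the take) went to staging the sale")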

What's worth noting, though, is that this is a one-time deal. Once those shares are in the public domain (on the stock market), they can be bought and sold and speculated with, and the issuing company receives no money whatsoever when these shares change hands. If I buy some stock at the beginning, then the company takes home some of that money. If I sell those shares to my friend Tom a week later, I get money from Tom, but I don't have to give anything to the issuing company. They don't own those shares anymore: I did, and now Tom does.
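
A tiny, deliberately simplified sketch of those cash flows (the names and amounts are made up, and real trading involves brokers, exchanges and clearing houses) shows who actually receives money at each step:

    # Cash on hand, in dollars; all figures hypothetical
    wallets = {"company": 0, "me": 1_000, "tom": 1_000}

    def primary_sale(buyer, price):
        """Initial issue: the buyer's cash goes to the issuing company."""
        wallets[buyer] -= price
        wallets["company"] += price

    def secondary_sale(seller, buyer, price):
        """Trading on the market: cash moves between shareholders only."""
        wallets[buyer] -= price
        wallets[seller] += price

    primary_sale("me", 100)           # I buy at the offering
    secondary_sale("me", "tom", 120)  # a week later I sell to Tom at a profit

    print(wallets)
    # {'company': 100, 'me': 1020, 'tom': 880}
    # The company saw money exactly once, at issue; Tom's 120 went to me, not to the company.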

This is the point, unfortunately, that most people miss. The issuing company only has so much to do with its stock as it is concerned to keep its value reasonably high, but this is more for image than financial reasons. People who buy and sell stock do so to make money. Anyone who "plays the market", as it is most accurately described, buys stock in the hopes that the price will rise so that they can sell it later for a profit. In other words, the company should do well enough that the share price rises so they can make money. Since the issuing company's only obligation is to increase its share price so that others can generate income, it is not truly accurate to call the stock buyers "investors". They aren't investing in the company, they are investing in themselves. Technically, the shareholders are "owners", but for the most part they are only concerned about the share price, not the working conditions, the employees, the customers, or the products or services themselves … or only insofar as these things have a positive influence on the share price.

The stock market, then, is really more like a casino than an investment, as one chief financial officer told me. What amazes me the most, though, is the amount of media coverage this particular casino gets. Fluctuations in the stock market are more often than not market players' emotional reactions to all kinds of events, but not really a sound indication of the health of the economy. I don't think it's ever a good idea to take your temperature in a casino.

2011-11-01

Stock market 101a

In the world of business, there are three ways for an organization to generate extra cash. Extra? Yes, that is, money that is not generated through regular operations. Money the organization wants to invest. We all know that it is wiser to save up for a large purchase before buying it, but in the go-go-go, consumer-driven world today, we too often resort to credit to satisfy our impulses. Some things, like a home, of course, are really too significant a purchase to save up for, but cars and stereos and smart phones and refrigerators are in fact manageable.

The savings of a business are called retained earnings, and sometimes these reserves are not enough to finance the next step forward, so the business has to get the money elsewhere. The three avenues open to them are, as in everyday life, to beg, borrow or steal. Really? Let me explain what I mean.

Let's start with borrowing since it is the most familiar to most of us. You go to a bank (usually) or other financial institution (could be a credit union, or Aunt Marge) and you negotiate a sum to be paid back over a specified period of time, and at a certain rate of interest. The riskier the bank feels this lending is, the higher the interest rate you end up paying. (It was once rating agencies which made such decisions, but they managed to tarnish their own reputations lately.)
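
As a rough illustration of how that perceived risk shows up in what you repay, here is a sketch of a plain fixed-rate, monthly-payment loan. The amounts and rates are hypothetical, and real loans add fees and fine print:

    def total_repaid(principal, annual_rate, years):
        """Total paid over the life of a standard amortized loan with monthly payments."""
        r = annual_rate / 12                          # monthly interest rate
        n = years * 12                                # number of monthly payments
        payment = principal * r / (1 - (1 + r) ** -n)
        return payment * n

    principal = 20_000                                # hypothetical loan amount
    for rate in (0.04, 0.09):                         # a "safe" borrower vs. a "risky" one
        print(f"at {rate:.0%}: about {total_repaid(principal, rate, 5):,.0f} repaid over 5 years")
    # roughly 22,100 at 4% versus roughly 24,900 at 9% -- the bank's nervousness has a price tag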

The second way is to beg. Actually, the organization itself offers promissory notes (in everyday speak: IOUs) called bonds. The organization is, within certain limits of course, free to say when and how the bonds will be paid out, but there are several agreed-on standards. Perhaps the most commonly known type of bond is the savings bond. When you buy a bond today for $37.00, in seven years the government promises to pay you back $50. The organization is basically saying, "trust me", and if you do, you can lend it money.
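
Sticking with those same numbers, we can back out what that "trust me" actually pays: $37 today growing to $50 in seven years works out to a compound return of a little over 4% a year. A quick sketch of the arithmetic, nothing more:

    price_today = 37.00   # what you pay for the bond now
    face_value  = 50.00   # what the government promises to pay back
    years       = 7

    implied_annual_yield = (face_value / price_today) ** (1 / years) - 1
    print(f"{implied_annual_yield:.2%} per year, compounded")  # about 4.40%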

Whimsical as I am, I listed "stealing" as the third way, but that's obviously not 100% accurate. The third way of generating cash is to issue shares of stock. These shares represent ownership, so the percentage of shares you hold determines your "share" of the business. Such an issuing can be private, that is, you offer a part of your business to someone else and you negotiate between yourselves how many shares and what they are worth. We don't often hear about these kinds of transactions in the news.

So now we have the basics and we can get to the fun part tomorrow.

2011-10-29

Business 101b

Turning one's thinking around isn't really as difficult as most would make it out to be. You simply need to take a step back, move slightly to the side, and then take a new look at the old idea. Failing that, sometimes a gentle nudge from without can help. Let's see:

Peter Drucker, the grandfather of American business theory, maintained that the purpose of a business (organization) is to satisfy needs in markets. I know that last word is a red flag for some, but let's keep it simple: by market I mean any "place" (real or virtual) where value is exchanged. This can be a swap meet, a flea market, an online bartering site or a commercial or industrial trade sector, or more.

In a world as complex as ours is today, this definition of purpose easily accommodates both for-profit and not-for-profit organizations. If there is a need for, say, computers and you have a business that makes them, then you are satisfying that need. If there is a need for charitable assistance for homeless people, then if you organize to do that, you are satisfying that need. If there is a need for government oversight in some area of the economy, then that agency, by performing its function, satisfies a need. Now that everyone is accounted for, we can move on to more meaningful topics like strategy, operational efficiency, environmental effectiveness or anything else that might help us get more out of what we do ... and by "more", I certainly don't mean just money.

Profit is money, so if the primary purpose of business is profit maximization, then you are trying to maximize your money. It follows, however, that if this is your first purpose, any other purpose you have can be, at best, second or lower. And it is here that too many businesses become disingenuous and start losing credibility in many people's eyes ... and rightfully so. You can claim that "product quality" or "customer satisfaction" or "the health of the patient" is paramount, but if you believe the mantra, you are not telling the truth. Profit maximization means nothing less than money first, and everything else comes after that. The order in which the rest come is really not all that important, for what's important is money. Naturally this does not and cannot work for not-for-profit organizations, but over the long term it's not good for for-profit businesses either. At some point the customer starts asking him or herself what they're getting for their money. Theirs is less; the business's is more.

I would be the last person to maintain that money isn't important. Just because it isn't the absolute top priority doesn't mean it has no priority at all or that we shouldn't care about it. Quite the contrary. It is essential for businesses to strive for sustainability, to ensure their long-term existence, not just for their owners, but for the employees, their families, the community and any other stakeholder in the business. In order to do this, it is absolutely necessary to take in more than you spend. This is just commonsense housekeeping. Making profit the measure of success, though, defeats this purpose because the needs that should be satisfied by organizations become relegated to second-class status, then ignored, then forgotten.

We need a new measure of success, but we won't be able to find one till we are ready to give up on ideas that have proved themselves wrong. Profit-maximization is one of them. It's time to give another idea a chance.

2011-10-28

Business 101a

The more things change, the more they stay the same. I earned my MBA a quarter of a century ago, and I spent the last twelve years tutoring on an internationally recognized MBA program, and I can assure you, not much has changed. In my day, the MBA mantra was simple: the purpose of business is the maximization of profit. But is that really the purpose of business?

To some it is. I'd be the last person to say it wasn't so. But just because we think something is a certain way, it doesn't mean that's the best way to look at it. Truth be told, that mantra is actually a formula for failure, and that is precisely what the latest financial crisis has shown us. We don't have to learn from our mistakes, I suppose (though I'd prefer we would), but it would be worthwhile thinking about just why it failed.

I don't like talking about businesses as if they were all one and the same thing (and regardless of what the Supreme Court thinks, they are not "people" ... an incorporated organization is technically a juridical person, meaning that in terms of the law it is in some limited ways like a person, but that does not make it a person any more than acting like the boss makes me the boss). It is much easier, and cleaner, to think of businesses as "organizations", that is, a collection of individuals who have associated themselves with one another to do something that none of them could do on his or her own. In financial terms, though, it is possible to distinguish between two primary types of organizations: those that generate a profit (for-profit organizations) and those that don't (non- or not-for-profit organizations). These latter types of organizations can take on different forms, from government departments and agencies to charities and more.

It is quite obvious that the MBA mantra doesn't -- and cannot -- apply to not-for-profit organizations. The MBA mantra is still being preached (with unwavering fervor), but the number of students from not-for-profit organizations seeking a graduate business degree has been increasing dramatically and steadily for the last 10 years. One has to do a lot of mental gymnastics to contort oneself enough to bring all of this together in a way that makes sense. Besides, it all can be so much simpler. We humans have a knack for simply getting things backwards, so perhaps it is time to simply turn our thinking around. It might take a bit of effort, but it really shouldn't hurt.

2011-10-26

Time to talk

Violence is a sign of total frustration. When someone is backed into a corner and escape seems impossible, a common response is simply violence.

It is particularly sad, however, when it is the government – be it municipal, state, or federal – which feels compelled to act this way, for they are, at least in theory, servants of the public. I know it doesn't seem like it. Politics and governance appear to have taken on lives of their own, or at least they appear to be living in their own little world. That's another sign of being overwhelmed by reality: retreating into one's own world.

For a good number of people, this is the time to find the guilty and ensure they are punished, but that's a really bad idea. There is no single entity, institution, group or person who can be blamed; we're all to blame. As George Carlin once quipped, "You're here, you're guilty, end of story." Well, almost. This is not the time to fix the blame, it's time to start fixing the problem.

Why? Because we're so good at it? Hardly. Because we know what needs to be done? Not in the least. Because we have a good chance of coming to a consensus on the way forward? Absolutely not. No, we have to start working on the fix simply because we have no choice. We've ranted and raved, thrown things, broken lots of stuff, almost burned the place to the ground, but, as expected, none of that has helped. No, now we have to take a deep breath, get calm, look each other in the eye, and do what we've practically forgotten how to do: we've got to start talking with one another.

You'll notice I said "with" ... not "to" or "at", not "out", "down" or "up". No, with. We've got to re-learn the lost arts of listening, reflecting, considering, questioning, and discussing. None of these need to be done with complete objective coolness. Passion is allowed, but not obsession. Assertiveness is allowed, but not abuse. Vigor is allowed, but not violence. But we have to start talking again.

Discussion is not a financial or economic method, so the moment we exchanged our society for a mere economy, we took away from ourselves perhaps our most powerful human problem-solving tool. We exchanged cleverness for clubs, reason for rubber bullets, and common sense for sonic cannons.

A recent article in New Scientist [online] by David Sloan Wilson, entitled "Selfless evolution: An idea rejected", shows how even Darwin acknowledged the necessity for a group selection function in evolution, and that the Spencerian notion of "the survival of the fittest" has only short-term, never long-term, benefits. And the recent financial crisis has shown -- once again -- what happens when you think short-term for too long a time. Cooperation, not competition, is the strategic approach, and societies are (or at least should be) based on cooperation. So, it's time to put the economy in its place, take back our society and start talking again. My grandkids will thank you all for it, believe me.