Tuesday, February 23, 2010

Avoiding a Digital Dark Age

Data longevity depends on both the storage medium and the ability to decipher the information

Kurt D. Bollacker

When I was a boy, I discovered a magnetic reel-to-reel audio tape recorder that my father had used to create “audio letters” to my mother while he was serving in the Vietnam War. To my delight (and his horror), I could listen to many of the old tapes he had made a decade before. Even better, I could make recordings myself and listen to them. However, all of my father’s tapes were decaying to some degree—flaking, stretching and breaking when played. It was clear that these tapes would not last forever, so I copied a few of them to new cassette tapes. While playing back the cassettes, I noticed that some of the sound quality was lost in the copying process. I wondered how many times I could make a copy before there was nothing left but a murky hiss.

A decade later in the 1980s I was in high school making backups of the hard drive of my PC onto 5-¼-inch floppy disks. I thought that because digital copies were “perfect,” and I could make perfect copies of perfect copies, I couldn’t lose my data, except by accident. I continued to believe that until years later in college, when I tried to restore my backup of 70 floppy disks onto a new PC. To my dismay, I discovered that I had lost the floppy disk containing the backup program itself, and thus could not restore my data. Some investigation revealed that the company that made the software had long since gone out of business. Requests on electronic bulletin board systems and searches on Usenet turned up nothing useful. Although all of the data on them may have survived, my disks were useless because of the proprietary encoding scheme used by my backup program.

The Dead Sea scrolls, made out of still-readable parchment and papyrus, are believed to have been created more than 2,000 years ago. Yet my barely 10-year-old digital floppy disks were essentially lost. I was furious! How had the shiny new world of digital data, which I had been taught was so superior to the old “analog” world, failed me? I wondered: Had I simply misplaced my faith, or was I missing something?

Over the course of the 20th century and into the 21st, an increasing proportion of the information we create and use has been in the form of digital data. Many (most?) of us have given up writing messages on paper, instead adopting electronic formats, and have exchanged film-based photographic cameras for digital ones. Will those precious family photographs and letters—that is, email messages—created today survive for future generations, or will they suffer a sad fate like my backup floppy disks? It seems unavoidable that most of the data in our future will be digital, so it behooves us to understand how to manage and preserve digital data so we can avoid what some have called the “digital dark age.” This is the idea—or fear!—that if we cannot learn to explicitly save our digital data, we will lose that data and, with it, the record that future generations might use to remember and understand us.

Save Our Bits!
The general problem of data preservation is twofold. The first matter is preservation of the data itself: The physical media on which data are written must be preserved, and these media must continue to accurately hold the data that are entrusted to them. This problem is the same for analog and digital media, but unless we are careful, digital media can be more fragile.

The second part of the equation is the comprehensibility of the data. Even if the storage medium survives perfectly, it will be of no use unless we can read and understand the data on it. With most analog technologies such as photographic prints and paper text documents, one can look directly at the medium to access the information. With all digital media, a machine and software are required to read and translate the data into a human-observable and comprehensible form. If the machine or software is lost, the data are likely to be unavailable or, effectively, lost as well.

Preservation
Unlike the many venerable institutions that have for centuries refined their techniques for preserving analog data on clay, stone, ceramic or paper, we have no corresponding reservoir of historical wisdom to teach us how to save our digital data. That does not mean there is nothing to learn from the past, only that we must work a little harder to find it. We can start by briefly looking at the historical trends and advances in data representation in human history. We can also turn to nature for a few important lessons.

The earliest known human records are millennia-old physical scrapings on whatever hard materials were available. This medium was often stone, dried clay, bone, bamboo strips or even tortoise shells. These substances were very durable—indeed, some specimens have survived for more than 5,000 years. However, stone tablets were heavy and bulky, and thus not very practical.

Possibly the first big advance in data representation was the invention of papyrus in Egypt about 5,500 years ago. Papyrus was lighter and easier to make than stone or clay tablets, and it took up considerably less space. It worked so well that it and its successors, such as parchment, vellum and paper, served as the primary repositories for most of the world’s information until the advent of the technological revolution of the 20th century.

Technology brought us photographic film, analog phonographic records, magnetic tapes and disks, optical recording, and a myriad of exotic, experimental and often short-lived data media. These technologies were able to represent data for which paper cannot easily be used (video, for example). The successful ones were also usually smaller, faster, cheaper and easier to use for their intended applications. In the last half of the 20th century, a large part of this advancement included a transition from analog to digital representations of data.

Even a brief investigation into a small sampling of information-storage media technologies throughout history quickly uncovers much dispute regarding how long a single piece of each type of media might survive. Such uncertainty cannot be settled without a time machine, but we can make reasonable guesses based on several sources of varying reliability. If we look at the time of invention, the estimated lifespan of a single piece of each type of media and the encoding method (analog or digital) for each type of data storage (see the table at right), we can see that new media types tend to have shorter lifespans than older ones, and digital types have shorter lifespans than analog ones. Why are these new media types less durable? Shouldn’t technology be getting better rather than worse? This mystery clamors for a little investigation.

To better understand the nature of and differences between analog and digital data encoding, let us use the example of magnetic tape, because it is one of the oldest media that has been used in both analog and digital domains. First, let’s look at the relationship between information density and data-loss risk. A standard 90-minute analog compact cassette is 0.00381 meters wide by about 129 meters long, and a typical digital audio tape (DAT) is 0.004 meters wide by 60 meters long. For audio encodings of similar quality (such as 16 bit, 44.1 kilohertz for digital, or 47.6 millimeters per second for analog), the DAT can record 500 minutes of stereo audio data per square meter of recordable surface, whereas the analog cassette can record 184 minutes per square meter. This means the DAT holds data about 2.7 times more densely than the cassette. The second table (right) gives this comparison for several common consumer audio-recording media types. Furthermore, disk technologies tend to hold data more densely than tapes, so it is no surprise that magnetic tape has all but disappeared from the consumer marketplace.
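
For readers who want to check the arithmetic, here is a small sketch in Python using the dimensions quoted above. The assumption that a 60-meter DAT holds about 120 minutes of 16-bit, 44.1-kilohertz stereo is mine, chosen to match the figures in the text.

    # Areal density of audio minutes per square meter, using the figures quoted above.
    cassette_area = 0.00381 * 129          # width (m) x length (m) of a 90-minute compact cassette
    dat_area = 0.004 * 60                  # width (m) x length (m) of a 60-meter DAT

    cassette_minutes = 90                  # total stereo audio on the cassette
    dat_minutes = 120                      # assumed capacity of the DAT at 16-bit, 44.1 kHz

    cassette_density = cassette_minutes / cassette_area   # about 183-184 minutes per square meter
    dat_density = dat_minutes / dat_area                   # about 500 minutes per square meter

    print(round(cassette_density), round(dat_density), round(dat_density / cassette_density, 1))
    # prints roughly: 183 500 2.7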

However, enhanced recording density is a double-edged sword. Assume that for each medium a square millimeter of surface is completely corrupted. Common sense tells us that media that hold more data in this square millimeter would experience more actual data loss; thus for a given amount of lost physical medium, more data will be lost from digital formats. There is a way to design digital encoding with a lower data density so as to avoid this problem, but it is not often used. Why? Cost and efficiency: It is usually cheaper to store data on digital media because of the increased density.

A possibly more important difference between digital and analog media comes from the intrinsic techniques that comprise their data representations. Analog is simply that—a physical analog of the data recorded. In the case of analog audio recordings on tape, the amplitude of the audio signal is represented as an amplitude in the magnetization of a point on the tape. If the tape is damaged, we hear a distortion, or “noise,” in the signal as it is played back. In general, the worse the damage, the worse the noise, but it is a smooth transition known as graceful degradation. This is a common property of a system that exhibits fault tolerance, so that partial failure of a system does not mean total failure.

Unlike in the analog world, digital data representations do not inherently degrade gracefully, because digital encoding methods represent data as a string of binary digits (“bits”). In positional number systems, some digits are worth more than others. A common digital encoding mechanism, pulse code modulation (PCM), represents the total amplitude value of an audio signal as a binary number, so damage to a random bit causes an unpredictable amount of actual damage to the signal.
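
A toy calculation in Python makes that unequal worth concrete. The 16-bit sample value below is hypothetical; the point is only that the size of the error depends entirely on which bit happens to be hit.

    # In a 16-bit PCM sample, the most significant bit is worth 32,768 times the least.
    sample = 12345                      # a hypothetical 16-bit sample value

    lsb_flipped = sample ^ (1 << 0)     # flip the least significant bit
    msb_flipped = sample ^ (1 << 15)    # flip the most significant bit

    print(abs(lsb_flipped - sample))    # 1: an imperceptible change in amplitude
    print(abs(msb_flipped - sample))    # 32768: a jump of half the full scale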

Let’s use software to concoct a simulated experiment that demonstrates this difference. We will compare analog and PCM encoding responses to random damage to a theoretically perfect audiotape and playback system. The first graph in the third figure (above) shows analog and PCM representations of a single audio tone, represented as a simple sine wave. In our perfect system, the original audio source signal is identical to the analog encoding. The PCM encoding has a stepped shape showing what is known as quantization error, which results from turning a continuous analog signal into a discrete digital signal. This class of error is usually imperceptible in a well-designed system, so we will ignore it for now.

For our comparison, we then randomly damage one-eighth of the simulated perfect tape so that the damaged parts have a random amplitude response. The second graph in the third figure (above) shows the effect of the damage on the analog and digital encoding schemes. We use a common device called a low-pass filter to help minimize the effect of the damage on our simulated output. Comparing the original undamaged audio signal to the reconstructions of the damaged analog and digital signals shows that, although both the analog and digital recordings are distorted, the digital recording has wilder swings and higher error peaks than the analog one.
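
The exact parameters of that simulation are not spelled out here, but a rough re-creation is easy to sketch in Python. The modeling choices below are my own: 16-bit PCM, one analog sample or one digital bit per physical spot on the tape, damage modeled as a partial dropout that leaves only 30 percent of the recorded level, and a crude moving-average low-pass filter.

    import numpy as np

    rng = np.random.default_rng(1)
    n, bits = 1000, 16
    t = np.linspace(0, 1, n, endpoint=False)
    signal = np.sin(2 * np.pi * 5 * t)                      # the original audio tone

    codes = np.round((signal + 1) / 2 * (2**bits - 1)).astype(int)
    bit_cells = (codes[:, None] >> np.arange(bits)) & 1     # one physical cell per stored bit

    # Damage one-eighth of the "tape": damaged spots keep only 30 percent of their level.
    analog_read = np.where(rng.random(n) < 1 / 8, 0.3 * signal, signal)
    damaged_cells = rng.random((n, bits)) < 1 / 8
    cell_levels = np.where(damaged_cells, 0.3 * bit_cells, bit_cells)
    read_bits = (cell_levels > 0.5).astype(int)             # threshold the levels back into bits
    read_codes = (read_bits << np.arange(bits)).sum(axis=1)
    digital_read = read_codes / (2**bits - 1) * 2 - 1

    def low_pass(x, width=9):                               # crude moving-average filter
        return np.convolve(x, np.ones(width) / width, mode="same")

    print("analog peak error ", np.abs(low_pass(analog_read) - signal).max())
    print("digital peak error", np.abs(low_pass(digital_read) - signal).max())
    # The analog error stays proportional to the physical damage, while the digital error
    # jumps whenever a high-order bit is lost, echoing the comparison described above.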

But digital media are supposed to be better, so what’s wrong here? The answer is that analog data-encoding techniques are intrinsically more robust in cases of media damage than naive digital-encoding schemes because of their inherent redundancy: there is simply more to a continuous signal than the bare string of bits that would describe it. That does not mean digital encodings are worse; it just means we have to do more work to build a better system. Luckily, that is not too hard. A very common way to do this is to use a binary-number representation that does not mind if a few bits are missing or broken.

One important example where this technique is used is known as an error correcting code (ECC). A commonly used ECC is the U.S. Postal Service’s POSTNET (Postal Numeric Encoding Technique), which represents ZIP codes on the front of posted envelopes. In this scheme, each decimal digit is represented as five binary digits, shown as long or short printed bars (right). If any single bar for any decimal digit were missing or incorrect, the representation would still not be confused with that of any other digit. For example, in the rightmost column of the table, the middle bar for each number has been erased, yet none of the numbers is mistakable for any of the others.
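
That property is easy to verify in a few lines of Python. The bar patterns below are the published POSTNET assignments ('1' for a long bar, '0' for a short one); the decoding helper is purely illustrative.

    # POSTNET-style two-out-of-five code: each decimal digit is five bars, exactly two long.
    POSTNET = {
        "0": "11000", "1": "00011", "2": "00101", "3": "00110", "4": "01001",
        "5": "01010", "6": "01100", "7": "10001", "8": "10010", "9": "10100",
    }

    def matches(bars):
        """Return every digit whose code fits a 5-bar pattern, where '?' marks an erased bar."""
        return [d for d, code in POSTNET.items()
                if all(b in ("?", c) for b, c in zip(bars, code))]

    # Erase the middle bar of every digit: each damaged pattern still matches exactly one digit.
    for digit, code in POSTNET.items():
        damaged = code[:2] + "?" + code[3:]
        assert matches(damaged) == [digit]
    print("every digit is still uniquely readable with its middle bar erased")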

Although there are limits to any specific ECC, in general, any digital-encoding scheme can be made as robust as desired against random errors by choosing an appropriate ECC. This is a basic result from the field of information theory, pioneered by Claude Shannon in the middle of the 20th century. However, whichever ECC we choose, there is an economic tradeoff: More redundancy usually means less efficiency.

Nature can also serve as a guide to the preservation of digital data. The digital data represented in the DNA of living creatures is copied into descendants, with only very rare errors when they reproduce. Bad copies (with destructive mutations) do not tend to survive. Similarly, we can copy digital data from medium to medium with very little or no error over a large number of generations. We can use easy and effective techniques to check whether a copy has errors, and if so, we can make another copy. For instance, a common error-catching program is called a checksum function: The algorithm breaks the data into fixed-length binary numbers and then adds them in some fashion to create a total, which can be compared with the same total computed from the copy. If the totals don’t match, there was likely an accidental error in copying. Error-free copying is not possible with analog data: Each generation of copies is worse than the one before, as I learned from my father’s reel-to-reel audiotapes.
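
A minimal sketch of such a checksum in Python, with the word size and the simulated copying error both chosen arbitrarily for illustration:

    def checksum16(data: bytes) -> int:
        """Sum the data as 16-bit words, folding the total back into 16 bits."""
        if len(data) % 2:
            data += b"\x00"                   # pad to a whole number of words
        words = (int.from_bytes(data[i:i + 2], "big") for i in range(0, len(data), 2))
        return sum(words) & 0xFFFF

    original = b"audio letters from 1969"
    copy = bytearray(original)
    copy[5] ^= 0x01                           # simulate a one-bit error during copying

    print(checksum16(original) == checksum16(bytes(copy)))   # False: make another copy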

Because any single piece of digital media tends to have a relatively short lifetime, we will have to make copies far more often than has been historically required of analog media. Like species in nature, data that are “reproduced” more easily and more often before their medium dies are more likely to survive. This notion of data promiscuity is helpful in thinking about preserving our own data. As an example, compare storage on a typical PC hard drive to that of a magnetic tape. Typically, hard drives are installed in a PC and used frequently until they die or are replaced. Tapes are usually written to only a few times (often as a backup, ironically) and then placed on a shelf. If a hard drive starts to fail, the user is likely to notice and can quickly make a copy. If a tape on a shelf starts to die, there is no easy way for the user to know, so very often the data on the tape perishes silently, likely to the future disappointment of the user.

Comprehensibility
In the 1960s, NASA launched Lunar Orbiter 1, which took breathtaking, famous photographs of the Earth juxtaposed with the Moon. In their rush to get astronauts to the Moon, NASA engineers created a mountain of magnetic tapes containing these important digital images and other space-mission-related data. However, only a specific, rare model of tape drive made for the U.S. military could read these tapes, and at the time (the 1970s to 1980s), NASA had no interest in keeping even one compatible drive in good repair. A heroic NASA archivist kept several donated broken tape drives in her garage for two decades until she was able to gain enough public interest to find experts to repair the drives and help her recover these images.

Contrast this with the opposite problem of the analog Phaistos Disk (above right), which was created some 3,500 years ago and is still in excellent physical condition. All of the data it stores (about 1,300 bits) have been preserved and are easily visible to the human eye. However, this disk shares one unfortunate characteristic with my set of 20-year-old floppy disks: No one can decipher the data on either one. The language in which the Phaistos Disk was written has long since been forgotten, just as the software needed to read my floppies is long gone.

These two examples demonstrate digital data preservation’s other challenge—comprehensibility. In order to survive, digital data must be understandable by both the machine reading them and the software interpreting them. Luckily, the short lifetime of digital media has forced us to gain some experience in solving this problem—the silver lining of the dark clouds of a looming potential digital dark age. There are at least two effective approaches: choosing data representation technologies wisely and creating mechanisms to reach backward in time from the future.

Make Good Choices …
In order to make sure digital data can be understood in the future, ideally we should choose representations for our data for which compatible hardware and software are likely to survive as well. Like species in nature, digital formats that are able to adapt to new environments and threats will tend to survive. Nature cannot predict the future, but the mechanism of mutation creates different species with different traits, and the fittest prevail.

Because we also can’t predict the future to know the best data-representation choices, we try to do as nature does. We can copy our digital data into as many different media, formats and encodings as possible and hope that some survive.

Another way to make good choices is to simply follow the pack. A famous example comes from the 1970s, when two competing standards for home video recording existed: Betamax and VHS. Although Betamax, by many technical measures, was a superior standard and was introduced first, the companies supporting VHS had better business and marketing strategies and eventually won the standards war. Betamax mostly fell into disuse by the late 1980s; VHS survived until the mid-2000s. Thus if a format or media standard is in more common use, it may be a better choice than one that is rare.

… Or Fake It!
Once we’ve thrown the dice on our data-representation choices, is there anything else we can do? We can hope we will not be stuck for decades, like our NASA archivist, or left with a perfectly readable but incomprehensible Phaistos disk. But what if our scattershot strategy of data representation fails, and we can’t read or understand our data with modern hardware and software? A very common approach is to fake it!

If we have old digital media for which no compatible hardware still exists, modern devices sometimes can be substituted. For example, cheap and ubiquitous optical scanners have been commonly used to read old 80-column IBM punchcards. This output solves half of the problem, leaving us with the task of finding hardware to run the software and interpret the data that we are again able to read.
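
As a sketch of what the software side of that recovery might look like, the Python fragment below assumes a scanner that reports, for each card column, which rows are punched; it handles only the plain digit encodings (a single punch in rows 0 through 9) and is an illustration, not any particular archive's tooling.

    # Decode scanned punch positions back into characters (digits only, for simplicity;
    # letters and symbols need the fuller Hollerith zone-punch table).
    def decode_column(punched_rows):
        if len(punched_rows) == 1 and 0 <= punched_rows[0] <= 9:
            return str(punched_rows[0])
        return "?"                            # not a plain digit in this simplified sketch

    card = [[1], [9], [6], [9]]               # hypothetical scan of four card columns
    print("".join(decode_column(column) for column in card))   # prints "1969"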

In the late 1950s IBM introduced the IBM 709 computer as a replacement for the older model IBM 704. The many technical improvements in the 709 made it unable to directly run software written for the 704. Because customers did not want either to lose their investment in the old software or to forgo new technological advances, IBM sold what it called an emulator module for the 709, which allowed it to pretend to be a 704 for the purposes of running the old software. Emulation is now a common technique used to run old software on new hardware. It does, however, have a problem of recursion—what happens when there is no longer compatible hardware to run the emulator itself? Emulators can be layered like Matryoshka dolls, one running inside another running inside another.
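
In miniature, an emulator is just a loop that fetches, decodes and executes another machine's instructions in software. The toy instruction set below is invented for illustration; it is not the IBM 704's.

    # A hypothetical three-instruction machine, interpreted by a fetch-decode-execute loop.
    def emulate(program):
        accumulator, pc = 0, 0
        while pc < len(program):
            op, arg = program[pc]             # fetch and decode
            if op == "LOAD":
                accumulator = arg
            elif op == "ADD":
                accumulator += arg
            elif op == "PRINT":
                print(accumulator)
            pc += 1                           # advance to the next instruction
        return accumulator

    # "Old software" for the imaginary machine still runs on today's hardware via the emulator.
    emulate([("LOAD", 2), ("ADD", 40), ("PRINT", None)])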

Being Practical
Given all of this varied advice, what can we do to save our personal digital data? First and foremost, make regular backup copies onto easily copied media (such as hard drives) and place these copies in different locations. Try reading documents, photos and other media whenever upgrading software or hardware, and convert them to new formats as needed. Lastly, if possible, print out highly important items and store them safely—there seems to be no getting away from occasionally reverting to this “outdated” media type. None of these steps will guarantee the data’s survival, but not taking them almost guarantees that the data will be lost, sooner or later. This process does seem to involve a lot more effort than my grandparents went to when shoving photos into a shoebox in the attic decades ago, but perhaps this is one of the costs for the miracles of our digital age.
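
For the digitally inclined, the spirit of that advice fits in a few lines of Python; the file and folder paths below are hypothetical, and the script is a sketch rather than a finished backup tool.

    # Copy a file to several places and verify each copy by comparing cryptographic digests.
    import hashlib, shutil
    from pathlib import Path

    def fingerprint(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    source = Path("family_photos/scan001.jpg")               # hypothetical file
    for destination in (Path("/mnt/backup_drive"), Path("/mnt/second_drive")):
        copy = Path(shutil.copy2(source, destination / source.name))
        assert fingerprint(copy) == fingerprint(source)      # re-read the copy and compare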

If all this seems like too much work, there is one last possibility. We could revert our digital data back to an analog form and use traditional media-preservation techniques. An extreme example of this is demonstrated by the Rosetta Project, a scholarly endeavor to preserve parallel texts of all of the world’s written languages. The project has created a metal disk (right) on which miniaturized versions of more than 13,000 pages of text and images have been etched using techniques similar to computer-chip lithography. It is expected that this disk could last up to 2,000 years because, physically, the disk has more in common with a stone tablet than a modern hard drive. Although this approach should work for some important data, it is much more expensive to use in the short term than almost any practical digital solution and is less capable in some cases (for example, it’s not good for audio or video). Perhaps it is better thought of as a cautionary example of what our future might look like if we cannot keep the digital world in which we find ourselves working over time.

Friday, February 12, 2010

Incredible 'Real' Reason for Carbon Trading?

Critics who think that the U.S. dollar will be replaced by some new global currency are perhaps thinking too small. On the world horizon looms a new global currency that could replace all paper currencies and the economic system upon which they are based. The new currency, simply called Carbon Currency, is designed to support a revolutionary new economic system based on energy (production and consumption), instead of price. Our current price-based economic system and its related currencies that have supported capitalism, socialism, fascism and communism are being herded to the slaughterhouse in order to make way for a new carbon-based world. It is plainly evident that the world is laboring under a dying system of price-based economics as evidenced by the rapid decline of paper currencies. The era of fiat (irredeemable paper currency) was introduced in 1971 when President Richard Nixon decoupled the U.S. dollar from gold. Because the dollar-turned-fiat was the world's primary reserve asset, all other currencies eventually followed suit, leaving us today with a global sea of paper that is increasingly undesired, unstable, unusable. The deathly economic state of today's world is a direct reflection of the sum of its sick and dying currencies, but this could soon change. - The August Review (Carbon Currency: A New Beginning for Technocracy?)

Dominant Social Theme: Green is good.

Free-Market Analysis: We tend to analyze articles that appear in the mainstream press, but regular readers know that the Bell will make an exception from time to time. And in this case, we have. The paper we have alluded to (excerpted above) seems to reveal details about the power elite's REAL agenda as regards global warming and carbon trading. While some of the information in the article has come out already in serial reports, we think the way the August Review has pulled it together and synthesized the information may be seen as both original and important.

In fact, the mind-blowing report that the Review is presenting today on its website (for the first time anywhere) sounds credible to us, understanding as we think we do, the memes of the power elite and the reason for their promotion. Click here to read Carbon Currency: A New Beginning for Technocracy?

We are not surprised by the quality, generally, of the Review's publications. The August Review is a "global elite research center." The tone of its analysis is often scholarly and its articles - while frank - seem to place a priority on research over opinion. Here's more on the August Review from the site itself:

The August Review is an exclusive Internet-based publication of the editor, Patrick M. Wood, and focuses on the Trilateral Commission, its members and activities. The research "juggernaut" that was created by Wood and Antony Sutton to study the Trilateral Commission has been enhanced using various professional sources now available on the Internet. This editor is committed to performing original and innovative research, as opposed to re-hashing second hand or opinionated writings of other news services or commentators. The August Review also monitors the press for news stories relating to members, policies and meetings of the Trilateral Commission.

The emphasis on making carbon an environmental bogeyman makes sense within a context of power elite promotions. The elite creates fear-based dominant social themes to frighten people into giving up wealth and power to authoritarian solutions that have also not-so-coincidentally been created by the elite. Regulate carbon and you basically have a way to monitor and control people's entire lifestyles, or certainly the part that involves the use of oil, gas, etc. That water vapor is responsible for most of the greenhouse effect doesn't enter into the equation - because the environmental movement in its later stages is not about reacting to environmental problems but about creating more power and wealth for the handful of families and individuals that create these promotions.

Here's some more from the new August Review white paper:

The modern system of carbon credits was an invention of the Kyoto Protocol and started to gain momentum in 2002 with the establishment of the first domestic economy-wide trading scheme in the U.K. After becoming international law in 2005, the trading market is now predicted to reach $3 trillion by 2020 or earlier.

Graciela Chichilnisky, director of the Columbia Consortium for Risk Management and a designer of the carbon credit text of the Kyoto Protocol, states that the carbon market "is therefore all about cash and trading - but it is also a way to a profitable and greener future." (See Who Needs a Carbon Market?)

Who are the "traders" that provide the open door to all this profit? Currently leading the pack are JPMorgan Chase, Goldman Sachs and Morgan Stanley. ...

Whoever controls the currency also controls the economy and the political structure that goes with it ... Technocracy and energy-based accounting are not idle or theoretical issues. If the global elite intends for Carbon Currency to supplant national currencies, then the world economic and political systems will also be fundamentally changed forever.

Considering the sheer force of global banking giants behind carbon trading, it's no wonder analysts are already predicting that the carbon market will soon dwarf all other commodities trading.

The August Review is a foremost authority on the Trilateral Commission. Here at the Bell we certainly believe that such private groups are damaging to free-enterprise because they inevitably seek to involve government power and to use the force of law for private ends. Such a system is called mercantilism and it is a true curse of modern Western societies.

It is the Bell's contention that the mercantilist destruction wrought by the power elite's dominant social themes has possibly met its match in the 21st century thanks to the Internet. It is the longtime contention of the editors of the Bell (for nearly a decade now in various publishing incarnations) that the Internet has been undermining the entire promotional program of the power elite and that the elite's memes would meet increased resistance as the Internet's influence grew. In fact, we believe this is taking place.

We have utilized the impact of the Gutenberg press on society as a historical reference point when making the case that the power elite will have to take "a step back" as it did before when confronted with the unique challenges of a major communications revolution. We are not impressed with arguments that because the US military agency DARPA invented what became the Internet, the power elite foresaw and planned for what the Internet has become. In fact, it was the invention of the floppy disk and the personal computer that created the phenomenon of the Internet, and this was the result of private enterprise and could not have been easily anticipated.

The August Review's presentation of "Carbon Currency: A New Beginning for Technocracy?" may be seen, from the above perspective, as a further example of how the Internet is causing headaches for the power elite and its banking and financial instrumentalities. Once a concept is understood and transmitted throughout the Internet, plenty of readers take advantage of the information and elaborate on it in their own way. This will happen, we are confident, with the revelations contained in "Carbon Currency: A New Beginning for Technocracy?" (It could thus mark an "end" rather than a new beginning, or certainly slow the momentum.)

Power elite promotions rely on secrecy and a sense of inevitability. But in the case of global warming, the promotion has been greatly damaged. It was the Internet that made possible the exposure of the reprehensible emails that showed a conspiracy to defraud as regards the impacts (and even the existence) of global warming. It was the Internet that provided people with a way to organize against environmental fascism. And now it is the Internet, in our opinion, that is exposing the further scheming that lies at the HEART of the power elite's promotional efforts as regards this horrid dominant social theme.

We are aware of the growing argument among certain observers of the alternative press that the exposure of power elite themes must be part of a larger plan. But if so, why didn't the same phenomenon occur in the 20th century when power elite promotions were at their peak and most powerful? The answer of course is the Internet. Every power elite meme from the war on terror, to global warming and central banking is under powerful attack these days. We can't imagine that this is a desirable outcome from the point of view of those involved with their implementation.

It may be that the power elite will begin "exposing" its own promotions in order to gain some advantage from Internet revelations. But we have difficulty believing that such an exercise will pay sufficient dividends to make up for the current destruction of its dominant social themes, which likely have to be rebuilt from the ground up - once the Internet is tamed (and when will that be?).

As a final aside, we are gratified that in this white paper, the August Review also deals with the fraud of peak oil. We are entirely unsurprised that Technocrat M. King Hubbert and his economically illiterate energy concepts manage to slither into the middle of the story that the August Review has to tell. The idea that the market itself would not (and somehow could not) respond to peak oil with new stores of energy is yet another power elite promotion. (Meme: Only government authorities, including the UN, can properly plan energy replacements!)

Conclusion: The August Review's presentation of the apparent planning and purpose behind the carbon scam is yet a further proof of the power of the Internet in our opinion. Between the emails revealing the conspiracy and the more recent revelations of phony research and false numbers (and the Review's seemingly accurate revelations as to where all this is really headed) we think the global warming meme is under extreme duress. Sure, it may stagger along - that's one of the hallmarks of a power elite promotion (it won't die no matter how many holes are shot into it) - but it's very hard to promote a theme or meme that has been discredited. And boy is it being discredited.

http://abeldanger.blogspot.com/2010/02/new-economic-system-based-on-carbon.html

Sunday, February 7, 2010

USA Default is Now Certain

Below is a quotation from:

http://www.silverbearcafe.com/private/02.10/bankrupt.html

"When governments go bankrupt it's called "a default." Currency speculators figured out how to accurately predict when a country would default. Two well-known economists - Alan Greenspan and Pablo Guidotti - published the secret formula in a 1999 academic paper. That's why the formula is called the Greenspan-Guidotti rule. The rule states: To avoid a default, countries should maintain hard currency reserves equal to at least 100% of their short-term foreign debt maturities. The world's largest money management firm, PIMCO, explains the rule this way: "The minimum benchmark of reserves equal to at least 100% of short-term external debt is known as the Greenspan-Guidotti rule. Greenspan-Guidotti is perhaps the single concept of reserve adequacy that has the most adherents and empirical support."

The principle behind the rule is simple. If you can't pay off all of your foreign debts in the next 12 months, you're a terrible credit risk. Speculators are going to target your bonds and your currency, making it impossible to refinance your debts. A default is assured.

So how does America rank on the Greenspan-Guidotti scale? It's a guaranteed default. . . .

According to the U.S. Treasury, $2 trillion worth of debt will mature in the next 12 months. So looking only at short-term debt, we know the Treasury will have to finance at least $2 trillion worth of maturing debt in the next 12 months. That might not cause a crisis if we were still funding our national debt internally. But since 1985, we've been a net debtor to the world. Today, foreigners own 44% of all our debts, which means we owe foreign creditors at least $880 billion in the next 12 months - an amount far larger than our reserves.

Keep in mind, this only covers our existing debts. The Office of Management and Budget is predicting a $1.5 trillion budget deficit over the next year. That puts our total funding requirements on the order of $3.5 trillion over the next 12 months.

So... where will the money come from? Total domestic savings in the U.S. are only around $600 billion annually. Even if we all put every penny of our savings into U.S. Treasury debt, we're still going to come up nearly $3 trillion short. That's an annual funding requirement equal to roughly 40% of GDP. Where is the money going to come from? From our foreign creditors? Not according to Greenspan-Guidotti. And not according to the Indian or the Russian central bank, which have stopped buying Treasury bills and begun to buy enormous amounts of gold. The Indians bought 200 metric tonnes this month. Sources in Russia say the central bank there will double its gold reserves."
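
Restated as a few lines of Python (the figures are the quoted author's, not independently checked):

    # The arithmetic in the quoted passage, in trillions of dollars.
    maturing_short_term_debt = 2.0
    foreign_share = 0.44
    projected_deficit = 1.5
    domestic_savings = 0.6

    owed_to_foreign_creditors = maturing_short_term_debt * foreign_share   # about $0.88 trillion
    total_funding_need = maturing_short_term_debt + projected_deficit      # about $3.5 trillion
    shortfall = total_funding_need - domestic_savings                      # roughly $2.9 trillion

    print(owed_to_foreign_creditors, total_funding_need, shortfall)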

Official government-manipulated figures show that the unemployment rate dropped in January from 10% down to 9.7%. At the same time, the report shows that another 20,000 jobs were lost that month.

I'm not sure how they reconcile those figures, unless a lot of people are now working two jobs.

The unemployment figures do not really tell us how many unemployed people there are. They only tell us how many are receiving unemployment checks. When those checks cease, those who remain unemployed are re-classified as "unemployable" or "discouraged." It is assumed that they have stopped looking for work, and so they are no longer counted as unemployed!

Last December the government originally reported job losses at 85,000, but they have now revised those figures to 150,000. Figures for January show another 20,000 jobs slashed from payrolls. And yet the unemployment rate went down to 9.7%. Go figure.

". . . the figures for December were revised to show 150,000 jobs were slashed from payrolls, instead of the 85,000 job cuts first reported."

http://www.usnews.com/money/careers/articles/2010/02/05/what-a-97-percent-unemployment-rate-means.html?PageNr=2

From the same article above, we read,

Morgan Stanley's Wieseman and Greenlaw said they are getting "closer to calling the peak in the unemployment rate." Still, the unemployment rate could bounce higher again if workers who dropped out of the job hunt are encouraged enough to jump back in.

In other words, if a lot of people believe that the job market is opening up again, and they actually start looking for work again, then the unemployment figures will "bounce higher again." Why? Simply because the official figures do not really count all the unemployed; they merely count those who are actively looking for work.
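
To make that concrete, here is a toy example in Python; the numbers are invented for illustration and are not the official January figures.

    # The "unemployment rate" only counts people still looking for work, so the rate can
    # fall even while jobs are being lost, provided enough people stop looking.
    employed, looking = 138_000_000, 15_300_000
    rate_before = looking / (employed + looking)             # about 10.0%

    employed -= 20_000                                       # jobs lost during the month
    looking += 20_000                                        # ...those workers start looking
    looking -= 500_000                                       # ...but many others give up entirely
    rate_after = looking / (employed + looking)              # about 9.7%, despite the job losses

    print(f"{rate_before:.1%} -> {rate_after:.1%}")          # 10.0% -> 9.7%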

Trouble in the Eurozone

The recent default of Dubai World is having a huge but hidden effect upon the rest of the world. Dubai is probably the main source of the derivatives market, and so their default will affect many other countries.

Then there are Greece, Portugal and Spain, all of which are in or near default. They call this "sovereign default" - when nations default on their loan payments.

America, too, is going to have to cough up $3 trillion in payments on its short-term debt this year. There is not enough production in the entire economy to do that. The government will have to default, or the Fed will have to create still more trillions out of thin air.

This economic problem is not going away, nor can it be resolved by throwing more new money at it. New money merely creates more debt. While that policy may solve the problem in the short term, it only increases the problem long term. Of course, they are hoping that postponing the problem will give the economy time to turn around before the ship hits the iceberg.

Most nations have been postponing the problem by quick fixes and by "stimulus" programs that only make things worse long term. No one recognizes the 10-ton elephant in the living room: the fact that the Treasury must BORROW money from the Fed in order to maintain a supply of money in circulation, instead of issuing Treasury Notes that are debt free.

Until we repeal the Federal Reserve Act and give back the power to create money to the Congress, the problem will only get worse, the private bankers will only get richer, and the people will be left impoverished in their own land.