Thursday, August 30, 2007

I Wondered How Long It Would Take

Abusing Animals While Black

In the Atlanta Journal-Constitution, one Kathy Rudy, an associate professor of "women's studies" at Duke University, weighs in on, as the headline puts it, "White Culture's Hypocrisy About Vick." That would be Michael Vick, the erstwhile NFL player who pleaded guilty to conspiracy earlier this week in a case involving illegal dogfighting.

Ms. Rudy, who is also an "ethicist," describes herself as "a strong advocate of animal welfare." Nonetheless, she says, "I find what's happening with Vick . . . alarming":

"We need to face the fact that dog fighting is not the only "sport" that abuses animals. Cruelty also occurs in rodeos, horse and dog racing (all of which mistreat animals and often kill them when no longer useful). There are also millions of dogs and cats we put to death in "shelters" across the country because they lack a home, and billions of creatures we torture in factory farms for our food.

"Vick treated his dogs very cruelly; there is no question about that. But I see one important difference between these more socially acceptable mistreatments and the anger focused on Vick: Vick is black, and most of the folks in charge of the other activities are white.

Ms. Rudy then goes on to make the following distinction:

"While white middle and upper classes continue to watch horses run to the point of exhaustion and risk breaking their legs, they regard dogfighting as something that only low-class "thugs and drug dealers" find entertaining. Indeed, a reading of many of the Vick news stories indicts him and his friends as much for being involved in hip-hop subculture as for fighting dogs. . . .

"I am not saying dogfighting is acceptable, but rather that Vick should be publicly criticized for that activity, not for his participation in hip-hop subculture. Whether or not dogs are fought more by minorities than white people is actually unknown, but the media representations of the last several weeks make it appear that black culture and dogfighting are inextricably intertwined. We need to find ways to condemn dogfighting without denigrating black culture with it. "

Huh? Denigrate means "to blacken," so Rudy is opposed to the blackening of black culture. That's the kind of intellectual rigor we've come to expect from the women's studies department at Duke.

But wait. Assuming she meant something like "disparaging" instead of "blackening," isn't she the one who's guilty of that? Blacks have made far worthier contributions to America than the "hip-hop subculture" with its "thugs and drug dealers." It is patronizing at best, racist at worst, to equate "black culture" with what Stanley Crouch has called "the most dehumanizing images of black people since the dawn of minstrelsy in the 19th century."

Ms. Rudy writes:

"I would like to believe that in 25 years we're going to look back on our current treatment of many animals as cruel and intolerable, and I do believe that the welfare of animals is coming into focus as the next great social movement in this country. Civil rights, feminism, gay and lesbian rights, and the Latino movement have transformed American life for the better. I think that can--and should--happen for animals. "

But if the "hip-hop subculture" is the apogee of "black culture," why should we expect that this transformational movement of animals will produce anything better than a dog-eat-dog world?

Tuesday, July 31, 2007

The Truth (about the iPhone) is Coming Out!

A month of use, and iPhone's not as cool
By DWIGHT SILVERMAN
Copyright 2007 Houston Chronicle

Even before you could buy one, the word on the iPhone was that it might be the Holy Grail of wireless devices. It was so highly praised that some took to calling it the "Jesus Phone."

The early reviews were almost fawning. Oh sure, they said, the iPhone has a few flaws, but hey! Look at the big, button-free screen! The cool Google Maps application! See how Web pages look like they should on it! And how, when you turn it sideways, pictures and Web pages rotate! And how you can use touch to move from photo to photo! And it's a great iPod, too!

Yeah, those other reviews were pretty breathless. Well, it's time to get a grip, kids, because this is not going to be one of those reviews.

The iPhone is a very different device, and when you first start working with it, there's definitely an OMG! effect. Its most in-your-face feature is its Cool Factor, which is what you'd expect, since this is from Apple. I wrote about this in an earlier column about Apple's brilliant marketing of the iPhone.

Yet there's quite a difference between being wowed by previously unseen features and using and relying on a device like the iPhone day to day. For some folks, its core features — which are slick on the surface — may be adequate. But if you're someone who relies heavily on a portable device for business e-mail and even creation of content, you're going to be frustrated.

I lived with the iPhone for about a month and, as an experiment, carried both it and my own PDA, a Samsung BlackJack. My goal was to see which device I preferred for which tasks. For example, when I wanted to browse the Web or check e-mail, which would I reach for first?

I started out using the iPhone more, because using it was an adventure. But by the end of my experiment, I was back to using the BlackJack for most serious tasks.

While the iPhone is indeed a very cool device, and there's a lot about it to like, I think its shortcomings are major.

Here's where I think the iPhone falls down:

E-mail. If your company uses Microsoft Exchange for its e-mail (and many do), you can forget about using the iPhone to get your business mail unless your systems folks are willing to turn on an older e-mail feature called IMAP. Many system administrators won't do this (including those at the Chronicle), leaving users in the lurch. I had to use the Web-based version, but the iPhone's Safari browser didn't get along with it (more on that later). And even if I had been able to get business e-mail, the iPhone's e-mail application is pretty to look at but frustrating to use. You can't batch-delete e-mails. The iPhone has no cut/paste capability, which makes it impossible to send examples of things found on the Web. And its e-mail isn't of the "push" variety that arrives in near real time. It checks e-mail every 15 minutes or so and has no easy way to alert you to something new (the poll-versus-push difference is sketched just after this list of gripes).

Web browsing. The iPhone's Safari browser is one of its strengths, but it's also a big weakness. Yes, it displays most Web pages better than any other hand-held device browser. But Safari is notorious among Web developers for glitches in the way it handles code such as JavaScript. For example, I was unable to edit or compose in chron.com's blogging software — Movable Type, a popular platform — because the scroll bars for the composition windows were missing. When I tried to write an e-mail on our Outlook Web Access page, I saw the text, but it often arrived blank for recipients.

Web pages must be designed a certain way for Safari's cool zoom feature to work properly, and many out there aren't. As a result, when I zoomed in, I often had to do a lot of side scrolling to read a page, or use the iPhone's two-fingered "pinch" motion to reduce the size of the page. That got old quickly.

Connectivity. The iPhone can connect to the Internet two ways: using Wi-Fi or AT&T's Edge data network. If you can get to Wi-Fi with the iPhone, you'll want to do it, even if you have to beg, borrow or steal, because the Edge network is incredibly slow. I gave up on trying to look at most Web pages when Wi-Fi wasn't available; it was too painful.

In addition, the iPhone has Bluetooth capabilities for connecting other devices wirelessly, but there's only one thing that it will pair with — a headset for hands-free talking. I've got a Bluetooth Apple keyboard, which would help with the next issue on my list of gripes, but the iPhone wouldn't see it.

Virtual keyboard. One of the iPhone's most vaunted features is its virtual onscreen keyboard. It's also been its most criticized. As you tap letters on the screen, the keys enlarge, helping you to hit them more accurately ... at least, in theory. While I got a little better over time, I never could get as fast on it as I am on my BlackJack. And the iPhone's feature that predicts what word you're trying to type is nowhere near as good as the T9 found in most other smart phones.

Memory. The iPhone comes in two storage capacities, 4 and 8 gigabytes, priced at $500 and $600, respectively. That's not a lot of room for the money. My video iPod has a 20-GB hard drive, and I've got about 14 GB of music on it. If I want to bring my complete music collection with me, I've got to tote both my iPhone and my iPod. Sorry, but for $600, I should be able to cram it all into one device.

There are other deficiencies, from the inability to use your own music as custom ringtones (you can't even buy new ones) to its recessed headphone jack that won't work with most car adapters and non-Apple headsets, to its sealed, send-it-in-to-Apple-for-replacement battery.

Some shortcomings could be fixed by software upgrades, but the key word there is "could." I'm not sure I'd want to spend $600 betting on the outcome.
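
To make that poll-versus-push gripe concrete, here is a minimal sketch of the kind of 15-minute IMAP polling loop the original iPhone's mail application effectively ran. This is illustrative only, not anything from Silverman's review: it's plain Python against the standard imaplib module, and the server name and credentials are placeholders, not real values.

    import imaplib
    import time

    # Placeholder server and credentials -- purely illustrative.
    HOST = "mail.example.com"
    USER = "user@example.com"
    PASSWORD = "secret"

    def count_unseen():
        """Log in over IMAP (the protocol Exchange admins had to enable
        for the iPhone) and count the unread messages in the inbox."""
        conn = imaplib.IMAP4_SSL(HOST)
        try:
            conn.login(USER, PASSWORD)
            conn.select("INBOX", readonly=True)
            status, data = conn.search(None, "UNSEEN")
            return len(data[0].split()) if status == "OK" else 0
        finally:
            conn.logout()

    # Poll every 15 minutes, as the iPhone's mail app did. New mail just
    # sits on the server until the next poll; nothing alerts the user sooner.
    while True:
        print("Unread messages:", count_unseen())
        time.sleep(15 * 60)

A true "push" arrangement (IMAP's IDLE extension, or Exchange's ActiveSync) instead keeps a connection open so the server can announce new mail the moment it arrives, which is how devices like the BlackJack deliver business e-mail in near real time.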

If you weren't one of the early buyers but you're considering getting one, wait for the next version. The iPhone has a lot of potential, and it will surely influence what other phone manufacturers do. But for now, you're better off using something else if you're serious about getting your data on the go.

dwight.silverman@chron.com

Wednesday, July 25, 2007

iPhone: Not as bad as I thought!

But it's still over-hyped

After Apple and AT&T reported quarterly earnings this week, the truth about the iPhone started to leak out. On Tuesday, AT&T revealed that only 156,000 phones were sold the first weekend, far fewer than the 500,000 to 1,000,000 that had been "forecast" by the usual crowd of Apple groupies.

More importantly, they let slip that another version of the iPhone is expected before Christmas! This is serious news because an iPhone cannot be upgraded to a newer model. The revelation is already affecting in-store sales, which have been slumping steadily. Many stores are flush with inventory, and you'll see price cuts or other promotions before long.

In the meantime, users are finding out about the severe weaknesses of the AT&T "Edge" network, which performs like a pathetic dial-up contraption. The only way anyone actually uses an iPhone now is with a connection to a local wireless network. But the new model will provide a high-speed connection; you just have to shell out another $500 for it.

Now for the good news: the damn thing is pretty slick. Everybody likes its ergonomics; it has a nice, high-quality feel, although it's a bit heavy, and its only real weakness is its lack of connectivity. Once that gets sorted out, the iPhone will take off, I'm sure. It's simply a matter of when its competitors begin arriving, and how aggressive their pricing will be. I predict that the new, 3G iPhone will be the hit product of the Christmas season.

Monday, July 02, 2007

The Jesus Phone

After two weeks of travel, I returned to base only to find out that the entire Western World was all a-twitter about the debut of Apple's iPhone. I have to hand it to the marketing mavens at Apple; they really know how to gin up hype. Of course, the key is not to let anybody actually use the product before its release. This permitted people to imagine that the iPhone would actually be better than anything out there; sort of a savior from the cell-phone hell that we all have had to endure for the past ten years. Kinda like Jesus Christ. (Hence the title, which I borrowed from the Friday Wall Street Journal article that critiqued the launch.)

I am going to go out on a limb, here, and say that the iPhone is going to disappoint all but the most rabid (and most affluent) Apple bigots. Why, you might ask? Well, it's really simple: the iPhone is going to be maddeningly slow! Especially when it comes to surfing the Web. The sexy ad copy notwithstanding, the iPhone is significant only because of its operating system, which is important, but mostly to gear-heads and tech nerds. To many end-users, the thing is going to lose its luster very quickly. It'll be like finding out that your $45,000 Porsche can't go over 55 mph!

First of all, it's no better as a phone than anything else sold at Cingular or AT&T stores, which is to say that it's slightly below average, since they market some of the worst crap around. In addition, AT&T can only offer the iPhone on its low-power, low-speed, low-grade "Edge" network. You heard me; the iPhone is not 3G-enabled! Most of the "old-fashioned" phones they've been selling at Cingular are way faster! The only hope an iPhone user has of getting decent response time for e-mail or Web use is to connect to a local wireless hot spot.

All of this is true because the geniuses at Apple want to wait a year or so before introducing a truly up-to-date phone. Just like they did with the Mac back in 1984, when the first-generation Macintosh shipped with a 3.5" floppy drive instead of a hard drive! Two years later, Apple included the hard drive in its second-generation machine. But they had accomplished their real mission, which was to get the Graphical User Interface (known in the PC world as Windows) out before it could be upstaged by Microsoft.

So, look for all kinds of raves from the usual quarters, followed by serious questions from people who don't drink the Silicon Valley Kool-aid served by Steve Jobs and his minions!

Tuesday, June 12, 2007

A National Security Approach to Immigration

How to Revive the Immigration Bill
By Robert Spencer
FrontPageMagazine.com June 12, 2007

With President Bush lobbying Congress to revive the defeated and disastrous immigration bill, authorities have a chance to recast the bill in a way that takes adequate measure of the national security implications of the immigration issue. And now, Lebanon, a nation currently under attack from al-Qaeda-linked terrorists, has shown the way with a measure that the President and Congress would do well to consider adapting for the United States. In an attempt to prevent jihadists from entering Lebanon from neighboring countries, the Lebanese Foreign Ministry and the General Security Department may stop giving entrants from Arab countries automatic entry visas; instead, they would have to apply at Lebanese missions in their native countries – allowing Lebanese officials time to scrutinize their applications and try to determine whether they are involved in jihad activity.

Such a proposal has a great deal to recommend it. Lebanon is treating immigration as a national security issue, as it manifestly is not only for Lebanon, but for the U.S. as well. With refreshing directness, Lebanese officials are considering heading off the problem at its source, or one of its sources, by restricting entry into the country from Arab countries from which jihadists come. Likewise, the U.S. also could, and should, institute restrictions on immigration from Muslim countries. This issue has been clouded by national traumas about “racism,” but in fact it has nothing to do with racism; jihadists with blonde hair and blue eyes are just as lethal, and should be just as unwelcome, as jihadists with dark skin. This is about taking prudent steps to protect ourselves and defend our nation. It is only a matter of common sense to recognize where the great majority of jihadists come from, and act accordingly.

Officials should proclaim a moratorium on all visa applications from Muslim countries, since there is no reliable way for American authorities to distinguish jihadists and potential jihadists from peaceful Muslims. Because this is not a racial issue, these restrictions should not apply to Christians and other non-Muslim citizens of those countries, although all should be subjected to reasonable scrutiny. Those who claim that such a measure is “Islamophobic” should be prepared to provide a workable way for immigration officials to distinguish jihadists from peaceful Muslims, or, if they cannot do so, should not impede basic steps the U.S. should take to protect itself. And Muslims entering from anywhere -- Britain, France -- should be questioned as to their adherence to Sharia and Islamic supremacism. This is not because anyone will expect honest answers, but so that answers proven false by the applicant’s subsequent activity can become grounds for deportation.

Meanwhile, this is not just an immigration problem. The Fort Dix and JFK Airport jihad terror plots uncovered in recent weeks not only underscore the need to fix our broken immigration policies, but also the need to deal with the fact that jihadists are already in the country. When twenty-six percent of Muslims in the United States under the age of thirty approve of suicide attacks in some circumstances, and two such plots have been uncovered in the last month, this is not an abstract problem. Islamic organizations in the U.S. that refuse to renounce and teach against political Islam should be reclassified as political organizations and made subject to all the controls and scrutiny to which political organizations are subject. And here again, words must be backed by deeds, or they can justly be regarded with suspicion.

If national security were our priority, these proposals would not even be controversial. Nor would Islamic advocacy groups in the U.S. oppose them, if national security were their priority. In fact, the proposals might spur those groups to become more energetic in rooting out jihadists from their own ranks, and from the Muslim community in America in general. Instead of the platitudes and half-measures we have seen up to now, along with active opposition to anti-terror efforts, we might see them take genuine steps to declare the ideology of jihad and Islamic supremacism beyond the pale of American Islam, and to renounce political Islam and any intention, now or in the future, to replace the U.S. Constitution with Islamic Sharia law.

But instead, the national debate still degenerates all too easily into charges of “racism,” while the real national security issues involved in immigration are shunted aside. A time may come, all too soon, when the American people will wish they had not for so long indulged this luxury. The President and Congress have a chance now to take up the immigration debate anew, and to think like statesmen, not like politicians. A realistic look at immigration as a national security matter would be a good place to start.

Thursday, May 24, 2007

Playground Lessons of Life

My Momma didn’t teach me much;
I learned everything on the playground.

1. The bigger kids run things;

2. You need to know the rules of any game to win; life is a game;

3. If you can help make the rules, you will be more successful;

4. You don’t always get picked to play; learn to entertain yourself;

5. Life’s a lot more fun if you have a buddy;

6. Girls aren’t as big or strong or as fast as boys; sometimes they’re smarter;

7. Avoid playing a position you play poorly; for that matter, avoid games you play poorly;

8. You learn better by doing something, not by watching someone else;

9. When you pick a team, your first choice makes a huge difference;

10. Some people aren’t as smart as you are, but some are a lot smarter; be their friend;

11. When you promise to do something, do it; you earn people’s trust by keeping your word; the only people to trust are the ones that keep theirs;

12. Only your mother likes to hear you cry; don’t waste your time on the playground;

13. Telling the truth is easier than lying; your memory isn’t good enough to lie well;

14. Always know the way to get home by yourself; you may have to, sometimes;

15. Your reputation is the only thing you carry from year to year; if it’s a good one, you have to live up to it. If it’s a bad one, you’ll have to overcome it.

Thursday, May 10, 2007

Your War, Not Mine!

by Victor Davis Hanson

"This war is lost," Sen. Majority Leader Harry Reid recently proclaimed. That pessimism about Iraq is now widely shared by his Democratic colleagues. But many of these converted doves aren't being quite honest about why they've radically changed their views of the war. Most of the serious Democratic presidential candidates -- Sens. Hillary Clinton, Joe Biden and Christopher Dodd, and former Sen. Jonathan Edwards -- once voted, along with Reid, to authorize the war. Sen. Barack Obama didn't. But, then, he wasn't in the Senate at the time. Now these former supporters of Iraq find themselves under assault by a Democratic base that demands apologies. Only Edwards has said he is sorry for his vote of support.

But if the Democratic Party is now almost uniformly anti-war, it is also understandable why it can't field a single major presidential candidate who was in Congress when it counted and tried to stop the invasion.

After all, responsible Democrats in national office had been convinced by Bill Clinton for eight years and then George W. Bush for two that Saddam's Iraq was both a conventional and terrorist threat to the United States and its regional allies.

Most in Congress accepted that Saddam was a genocidal mass murderer. They knew he used his petrodollars to acquire dangerous weapons. And they felt his savagery was intolerable in a post-9/11 world. There was no debate that Saddam gave money to the families of Palestinian suicide bombers or offered sanctuary to terrorists like Abu Abbas and Abu Nidal. And few Democrats questioned whether the al-Qaida-affiliated terrorist group Ansar al-Islam was in Kurdistan.

In other words, Democrats, like most others, wanted Saddam taken out for a variety of reasons beyond fears of WMD. Moreover, it was the Clinton-appointed CIA director George Tenet who supplied both Democrats and Republicans in Congress with much of the intelligence they would later cite in deciding to attack Saddam.

When both congressional Democrats and Republicans cast their votes to go along with President Bush, they even crafted 23 formal causes for war. So far only the writ concerning the fear of stockpiles of weapons of mass destruction has in hindsight proven false.

But we no longer hear much about these various reasons why the Democrats understandably supported the removal of Saddam Hussein. Instead, they now most often plead they were hoodwinked by sneaky warmongering neocons or sexed-up partisan intelligence reports.

There is nothing wrong with changing your mind, especially in matters as serious as war -- but the public at least deserves a sincere explanation for this radical about-face.

So why not come clean about their changes of heart?

Many Democrats apparently think that claiming they were victimized by Bush and the neocons is more palatable than confessing to their own demoralization with the news from the front. Others may fear that admitting publicly that a disheartened America should not or cannot finish a conflict would send a dangerous message to our enemies. So while these Democrats accuse President Bush of being hardheaded and unwavering on Iraq, they are still afraid that their own mea culpas would send an equally dangerous message of inconsistency abroad.

Democrats need to admit the truth: that removing a dangerous Saddam Hussein and promoting democracy in his place seemed a good idea to them in 2003-4 when the cost appeared tolerable. Now, in 2007, with over 3,000 American lives lost in Iraq, they feel differently.

In other words, Democrats could argue that somewhere along the line -- whether it was after Fallujah or the start of sectarian Sunni-Shiite violence -- they either lost confidence in the United States' very ability to stabilize Iraq, or felt that even if we could, it was no longer worth the tab in American blood and treasure. That confession could, of course, be nuanced with exculpatory arguments about the mistakes made by those in the Bush administration, such as: "Our necessary war that I voted for to remove Saddam worked; your optional one to stay on to promote democracy didn't." Such an explanation of the turnabout would be transparent and invite a public discussion. And it would certainly be more legitimate than the current protestations of "the neocons made me do it."

With America still engaged in a tough war, that kind of excuse-making just doesn't cut it.

Thursday, April 26, 2007

The War is Over!

Today was a milestone in history. The Democrats in Congress have made it very clear that when they take control of the White House, they will immediately surrender to the Islamic fanatics who are tearing Iraq apart. Of course, the results will be catastrophic in the long run, but nary a Democrat seems to care.

The Roman Empire did not collapse because it lost a war. Its collapse started with the Roman Senate deciding to cut the pay of faraway soldiers defending the frontiers. Within a few years, the Roman citizens who were serving in the army departed the service, leaving only mercenaries. Sometime after that, several of the generals, who themselves were mostly from the hinterlands, began to leave as well, forming roving bands that extracted ransoms from various towns and villages to pay their way. By 400 A.D., one of these bands was so successful that it began to move toward Rome. In 410, its leader, Alaric, laid siege to the city of Rome itself and demanded tribute. As you may know, Rome was no longer the capital of the Empire, the seat of government having been moved to Constantinople. But when the citizens were unable to continue monthly payments, Alaric cut off the water supply. Within two months, the city's population had declined by half, and the remainder threw open the gates. The barbarians raided Rome for two weeks, taking everything they desired. Then they left. Roman citizens were shattered by this defeat and never stood up to an armed force again, even in World War II.

A few weeks ago, I encouraged the election of a Democrat for President in the hopes that the responsibilities of office would correct their obvious blindness. It is probably too late for that.

Within 15 years, we will be engaged in armed conflict against Islamic extremists here in North America, and we will continue to lose. A nation without a spine is no match for a determined enemy that wishes to impose its will on its captives, or kill them.

Victor Davis Hanson provides a more optimistic view in this article: http://www.realclearpolitics.com/articles/2007/04/is_the_war_on_terror_over.html

Thursday, March 22, 2007

Global Warming Baloney II

From the New York Post:
AL'S WARMING LIES
By IAIN MURRAY

March 22, 2007 -- AL Gore was born and spent most of his life in Washington, D.C. Yesterday, he returned to the fever swamp to show he's forgotten none of his old political tricks. Addressing the House and Senate on global warming, he put forth a litany of half-truths that he twisted into a morality tale. But the facts tell a different story. The former veep is a master politician, not a prophet or a planetary savior.

Gore's biggest rhetorical trick is saying that the Earth has a fever. He says that 10 of the hottest years in history came in the last 11 years, and this proves we must do something, because, "If your baby has a fever, you go to the doctor."

This is meaningless. The Earth has been much, much hotter in the past than today. No giant space nanny fed it medicine.

Moreover, a healthy baby has a constant temperature - that's why a fever is bad. The Earth does not have a constant temperature. It has been generally warming since the end of the Little Ice Age in the early 19th century, but that warming has not been uniform. It's had warming phases (the 1920s and 1930s) and cooling phases (the 1940s to 1970s).

It's also had periods like today, when temperatures are flat - there hasn't been much warming since 1998. Yes, it's warmer today than it was a hundred years ago, but that's not necessarily a bad thing. Talking about fevers is misleading, but it's a great rhetorical trick.

And when it comes to the economics of the issue, Gore is way outside the mainstream.

Appearing before a House committee, he said that changing the American economy in the way he proposes - a plan of freezes, taxes, market controls and regulations that would represent a massive expansion of government control over the economy - would not be costly.

Yet he also endorsed the ill-fated Kyoto Protocol (which he helped negotiate). The U.S. Energy Information Administration calculates that Kyoto would reduce U.S. gross domestic product by $100 billion to $400 billion a year.

Gore is a very wealthy man, but it's hard to see how he can fail to recognize that this is a lot of money lost - and a lot of jobs lost and a lot of families going cold and hungry.

How does Gore address this point? He doesn't; he simply avoids it, with highfalutin rhetoric. It's not just the Earth's "fever" and our supposed moral duty to cure it; he says our descendants will either condemn us as blind or praise us for our moral courage. He also makes veiled references to himself as Churchill, while all around him others appease fascism.

It's not subtle stuff - nor accurate.

If you establish that the Earth is warming, it doesn't necessarily follow that we have a moral duty to reduce emissions. What should follow is an informed debate about the costs and benefits of various policies to address that warming - reducing emissions is just one possible answer. Another debate should focus on those policies' economic costs.

Al Gore doesn't want to have those debates, because the majority of evidence suggests that emissions reduction will be very costly and will have little effect. Kyoto, fully enacted by all its parties, would for all its cost reduce global warming by a mere 0.07 degrees Celsius by 2050 - a barely detectable amount.

Meanwhile, 2 billion people around the world go without electricity. About 3 million die each year because of fumes given off by primitive stoves. The U.S. economy sneezes when gasoline hits $3 a gallon.

If we have a moral duty, it's to keep energy affordable here and to expand access to it overseas. That's the real moral truth, however inconvenient for Al Gore.

Iain Murray is senior fellow in Energy, Science and Technology at the Competitive Enterprise Institute in Washington, D.C.

Wednesday, February 28, 2007

Ralph Peters has it right!

The roots of today's wars
by Ralph Peters

President Bush’s refrain about Iraq is that we’re engaged in a ‘war of ideas.’ Not true. Our enemies are waging wars of religion and ethnicity, whether we like it or not. Critics have made the case that insurgencies can’t be defeated. Wrong again.

I cringe each time President Bush repeats his claim that we're engaged in "a battle of ideas." We're not. Our enemies aren't fighting about ideas, but over fundamental issues of identity: faith and ethnicity. Their motivations make them far more implacable, and even crueler, than yesteryear's ideological opponents.

In Washington, Republicans and Democrats alike are lost in history, clinging to an outmoded, if comfortable, view of the world as we wish it to be, rather than as it is. But we face a radically changed global environment that makes nonsense of the last century's theories of international relations and the ability to regulate warfare. An epoch has ended, and a new historical period — with terrifying new rules — has begun.

From 1789 and the French Revolution until the Soviet Union's disintegration in 1991, humankind took a bizarre historical detour through the Age of Ideology, when hundreds of millions — if not billions — of people accepted the notion that intellectuals and other charlatans could design better systems of social and political organization than had arisen naturally.

The arrogance of men such as Karl Marx, Adolf Hitler and Mao Zedong in believing that they could compress human complexity into their scribbled utopian visions may have been stunning, but the willingness of the masses to put their faith in such systems was a form of collective madness.

Inevitably, human beings disappointed the demagogues who tried to perfect humanity. Leaders responded by forcing men and women to fit the "ideal" pattern, and the quest for utopia led inexorably to the gulag and Auschwitz, to Mao's Cultural Revolution, the killing fields of Cambodia or, at best, the poverty of today's Havana.

The Cold War was a battle of ideas. Iraq isn't.

Back to the mainstream:
The Age of Ideology still echoes in Latin America, but the great "isms" of the 19th and 20th centuries are essentially dead, unlikely to rise from the grave. Unfortunately, it doesn't mean we've entered a new era of peace: We've simply returned to the mainstream of history, to conflicts over religion and ethnicity.

As globalization paradoxically revived old identities of faith and tribe in traditional societies, such default allegiances became worth fighting for again. Men are once more killing to please an angry god or to avenge (real or imagined) ethnic wrongs.

The turmoil in Iraq and Afghanistan today, and that which we are bound to face elsewhere tomorrow, is asymmetrical not only in military terms, but in the motivations that stoke the violence. We have ideas, ranging from the universal validity of individual freedom and the power of democracy, to equal rights for women. Our enemies have passions — the ecstatic intoxication of faith and the Darwinian bitterness of the tribe — that give them a ferocious strength of will.

Iraq has been a terrible disappointment to many who believed in the galvanizing power of our ideas. Instead, we unleashed the killing power of faiths struggling for supremacy and the savagery of ethnic strife. This is the warfare of the Old Testament, of the book of Joshua, an ineradicable pattern of human behavior. For our part, we try to fight with lawyers at our elbows.

Our two major political parties may have different views on Iraq, but what's deeply worrisome is their shared view of the world as amenable to the last century's solutions: Negotiations first and foremost, with limited war when negotiations fail. But our enemies are only interested in negotiations when they need to buy time, while our limited approach to warfare only limits our chance of success.

Washington's unwillingness to face the new global reality is compounded by our ignorance of history — which lets spurious claims pass as facts. For example, talking heads somberly assure us (vis-à-vis Iraq) that insurgencies are virtually impossible to defeat. That's false. Over the past 3,000 years, insurgencies and revolts have failed overwhelmingly. It was only during the brief and now-defunct Age of Ideology that insurgents scored substantial victories — usually because imperial powers were already in retreat and anxious to leave the territory the insurgents contested.

The bad news here is that, while throughout history most insurgencies failed, they had to be put down with substantial bloodletting. Across three millennia, I can find no major religion-driven insurgency that was suppressed without significant slaughter.

Even the insurgencies of the Age of Ideology failed more often than not: French savagery won the Battle of Algiers, but the victory came too late because the French people had already given up on the struggle (a foretaste of Iraq?). The British destroyed the Mau Mau movement in Kenya with hanging courts, concentration camps and resolute military action — then left because they had no interest in remaining.

What has worked:
Historically, the common denominator of successful counterinsurgency operations is that only an uncompromising military approach works — not winning hearts and minds nor a negotiated compromise. This runs counter to our politically correct worldview, but the historical evidence is incontestable.

Simply because the truth is hateful to us doesn't mean that we can declare it false. We have entered a grim new age in which we must cope simultaneously with a return to old-fashioned wars of blood and belief, with the fatally flawed borders left behind by European imperialism, with the destabilizing effects of the information age on traditional societies, and with the explosion of our cherished myths about the pacific nature of humankind.

There were many things we failed to understand about Iraq, but our comprehensive mistake has been failing to understand our place in history.

Ralph Peters is a member of USA TODAY's board of contributors and the author, most recently, of Never Quit the Fight.

Wednesday, February 21, 2007

Global Warming Baloney!

One of the hot topics of our time is the talk of human-caused Global Warming. We should not be surprised that so much attention is devoted to this nonsense issue. First, it postulates a great global catastrophe, which the media love. Second, the nature of the looming disaster REQUIRES massive central governmental action and control, which socialists and communists of all stripes love. Third, the entire argument is cloaked in scientific mumbo-jumbo, which sidesteps the need for discussion and debate, which pseudo-intellectuals and academics love. Finally, the issue is being publicized through a spokesman, Al Gore, whom the Left views as a symbol of What's Wrong With America, namely the Stolen Florida Election, which cast the United States back into a dark age of Republican rule.

I only ask that we step back a bit and assess the claims of the Global Warming enthusiasts. The key to their entire argument is that rising levels of "Global Warming Gases" (GWG) are the direct result of increasing human activity. If that argument doesn't fly, then the rest of their case falls apart. With that in mind, consider the following from Pete du Pont:

Plus Ça (Climate) Change
The Earth was warming before global warming was cool.

BY PETE DU PONT
Wednesday, February 21, 2007 12:01 a.m. EST

When Eric the Red led the Norwegian Vikings to Greenland in the late 900s, it was an ice-free farm country--grass for sheep and cattle, open water for fishing, a livable climate--so good a colony that by 1100 there were 3,000 people living there. Then came the Little Ice Age. By 1400, average temperatures had declined by 2.7 degrees Fahrenheit, the glaciers had crushed southward across the farmlands and harbors, and the Vikings did not survive.

Such global temperature fluctuations are not surprising, for looking back in history we see a regular pattern of warming and cooling. From 200 B.C. to A.D. 600 came the Roman Warming; from 600 to 900, the cold period of the Dark Ages; from 900 to 1300, the Medieval Warming; and from 1300 to 1850, the Little Ice Age.

During the 20th century the earth did indeed warm--by 1 degree Fahrenheit. But a look at the data shows that within the century temperatures varied with time: from 1900 to 1910 the world cooled; from 1910 to 1940 it warmed; from 1940 to the late 1970s it cooled again, and since then it has been warming. Today our climate is 1/20th of a degree Fahrenheit warmer than it was in 2001.

Many things are contributing to such global temperature changes. Solar radiation is one. Sunspot activity has reached a thousand-year high, according to European astronomy institutions. Solar radiation is reducing Mars's southern icecap, which has been shrinking for three summers despite the absence of SUVs and coal-fired electrical plants anywhere on the Red Planet. Back on Earth, a NASA study reports that solar radiation has increased in each of the past two decades, and environmental scholar Bjorn Lomborg, citing a 1997 atmosphere-ocean general circulation model, observes that "the increase in direct solar irradiation over the past 30 years is responsible for about 40 percent of the observed global warming."

Statistics suggest that while there has indeed been a slight warming in the past century, much of it was neither human-induced nor geographically uniform. Half of the past century's warming occurred before 1940, when the human population and its industrial base were far smaller than now. And while global temperatures are now slightly up, in some areas they are dramatically down. According to "Climate Change and Its Impacts," a study published last spring by the National Center for Policy Analysis, the ice mass in Greenland has grown, and "average summer temperatures at the summit of the Greenland ice sheet have decreased 4 degrees Fahrenheit per decade since the late 1980s." British environmental analyst Lord Christopher Monckton says that from 1993 through 2003 the Greenland ice sheet "grew an average extra thickness of 2 inches a year," and that in the past 30 years the mass of the Antarctic ice sheet has grown as well.

Earlier this month the U.N.'s Intergovernmental Panel on Climate Change released a summary of its fourth five-year report. Although the full report won't be out until May, the summary has reinvigorated the global warming discussion.

While global warming alarmism has become a daily American press feature, the IPCC, in its new report, is backtracking on its warming predictions. While Al Gore's "An Inconvenient Truth" warns of up to 20 feet of sea-level increase, the IPCC has halved its estimate of the rise in sea level by the end of this century, to 17 inches from 36. It has reduced its estimate of the impact of global greenhouse-gas emissions on global climate by more than one-third, because, it says, pollutant particles reflect sunlight back into space and this has a cooling effect.

The IPCC confirms its 2001 conclusion that global warming will have little effect on the number of typhoons or hurricanes the world will experience, but it does not note that there has been a steady decrease in the number of global hurricane days since 1970--from 600 to 400 days, according to Georgia Tech atmospheric scientist Peter Webster.

The IPCC does not explain why from 1940 to 1975, while carbon dioxide emissions were rising, global temperatures were falling, nor does it admit that its 2001 "hockey stick" graph showing a dramatic temperature increase beginning in the 1970s had omitted the Little Ice Age and Medieval Warming temperature changes, apparently in order to make the new global warming increases appear more dramatic.

Sometimes the consequences of bad science can be serious. In a 2000 issue of Nature Medicine magazine, four international scientists observed that "in less than two decades, spraying of houses with DDT reduced Sri Lanka's malaria burden from 2.8 million cases and 7,000 deaths [in 1948] to 17 cases and no deaths" in 1963. Then came Rachel Carson's book "Silent Spring," invigorating environmentalism and leading to outright bans of DDT in some countries. When Sri Lanka ended the use of DDT in 1968, instead of 17 malaria cases it had 480,000.

Yet the Sierra Club in 1971 demanded "a ban, not just a curb," on the use of DDT "even in the tropical countries where DDT has kept malaria under control." International environmental controls were more important than the lives of human beings. For more than three decades this view prevailed, until the restrictions were finally lifted last September.

As we have seen since the beginning of time, and from the Vikings' experience in Greenland, our world experiences cyclical climate changes. America needs to understand clearly what is happening and why before we sign onto U.N. environmental agreements, shut down our industries and power plants, and limit our economic growth.

Mr. du Pont, a former governor of Delaware, is chairman of the Dallas-based National Center for Policy Analysis. His column appears in the Wall Street Journal once a month.

Monday, February 19, 2007

It's Time for the Democrats to Take Over!

It's been more than three months since I last posted, and it has been a difficult ninety days. Our country is struggling in a dangerous world. President Bush and the GOP are facing all-time low levels of public support. Yet we have the best economic conditions in twenty years. So we are fat and happy. Most of us couldn't care less what happens in Iraq or North Korea, or anywhere else, for that matter. What's important is what's happening to our favorite celebrities (wasn't the premature death of Anna Nicole Smith a tragedy?).

We are ready, as a nation, for an extended period of isolationism, and we have just the people to lead us into that new era of head-in-the-sand stupidity: the Democrats. Today, I am declaring myself in favor of a Democratic presidency, beginning in 2008!

Now some of you may think that such a turn of events will cause great harm to our Republic. Of course, you are correct. But we can afford a few years of socialism, utter stupidity, and other associated silliness (we survived eight years of Clinton, didn't we?). What we cannot afford for a generation is the spectre of nearly half our people behaving so irresponsibly as to create a danger for all of us. Today, in the Senate and the House, we have our highest-paid public servants acting like complete jackasses. There is a reason for this behavior: Democrats do not take the threat of Islamo-fascism seriously. Almost all of them believe that a Republican in the White House is more dangerous than a group of Muslim fanatics on an airplane.

The only way to cure them of that blindness is to put them in charge, which is what my friend Jonah Goldberg advocates in his most recent column in National Review:

Jonah Goldberg's column

Some long-term damage will result, whether it's Hillary, B.O., or any of the other wannabes (please, Dear God, don't let it be Joe "Blowhard" Biden). But whoever it is will have to face reality within days or, at the most, weeks of the 2009 inauguration. Shortly thereafter they will call meetings with key Capitol Hill Democrats and explain the Truth to them. The result will be a return to a realistic, bi-partisan foreign policy during a time of war. The last time such a thing occurred was more than forty years ago, in 1966, when Lyndon Johnson was President. Before that it was 1948 when Harry Truman was President. And before that it was 1941, when Franklin Roosevelt was in office.

Eventually, the Democrats will screw it up so badly that Republicans will be returned to office, probably in 2012. We were able to stomach Jimmy Carter for only four years, remember. But in the meantime, for the sake of the country, we need a Democrat in the White House.