Saturday, 17 October 2020

REMEMBER: “Mandate for Change,” or Business as Usual

 

“Mandate for Change,” or Business as Usual

Noam Chomsky

Z Magazine, February 1993

As the victors were recovering from the celebration of their electoral triumph in November, the front-page headlines read: “The Economy: Aides plan a scaled-back early agenda.” “A group of aides to President-elect Bill Clinton is preparing an economic plan for the Democrat’s early days in office that would postpone action on some of his more sweeping proposals in favor of a moderate increase in infrastructure spending, preliminary steps to control rising health care costs and a series of business tax breaks that rated little mention during the campaign,” political correspondent Peter Gosselin reported. In the following days, Clinton’s advisers, including those on the progressive fringe, reiterated the message, which had been understood all along by corporate-financial sectors; the steadiness of the markets and strength of the dollar were among the many indications of the general satisfaction with Clinton’s imminent victory in the business world, which was soon to be reassured further by his top appointments and indication of priorities.

The magic word in Clinton’s campaign had been “Change,” a reorientation of policy toward the needs of the great majority of the population who had suffered from Reagan-Bush “trickle down” economics — in practice, an upward flood — and had swept Clinton into office on the promise of an end to the party for the rich. But it would be unfair to speak unkindly of the newly-elected President for clarifying at once that the fine words of the campaign were not intended seriously, that the “Mandate for Change” proclaimed by a Clinton think tank meant “Business as Usual,” as it did when Eisenhower’s PR team coined the phrase. “Campaign pledges [are] made to be broken,” Harvard political scientist and media specialist Marty Linsky explained when President Bush called for “revenue enhancement” after winning the 1988 election with a pledge not to raise taxes. To accuse Bush of violating his campaign pledge was a “political cheap shot.” When he led the public in his “read my lips — no new taxes” chant, Bush had merely been expressing his “world view,” making “a statement of his hopes.” The same precepts hold for his successor.

Only the most naive, who do not comprehend the democratic system, could think that their political representatives mean what they say. Sophisticates understand that “elections and governing are different ball games, played with different objectives and rules.” “The purpose of elections is to win,” Linsky elaborated, expressing the contempt for democracy that is standard fare among educated elites; and “the purpose of governing is to do the best for the country” — where “the country” is to be understood as “those who matter,” though honesty on that score as well would be too much to expect.

This course of instruction is helpful. The lessons have broad application. Take the concept “jobs.” It is beyond doubt that more and better jobs are desperately needed. “Job destruction [is] worse than we thought,” economists Lawrence Mishel and Jared Bernstein report, with “more than 17 million workers, representing 13.2 percent of the labor force, …unemployed or underemployed in July [1992],” a rise of 8 million during the Bush years. Furthermore, some three-fourths of the rise in unemployment is permanent loss of jobs. Meanwhile the stagnation of real wages changed to sharp decline from the mid-1980s, extending even to the college-educated, while “of the gain in income per head, 70 percent accrued to the top 1 percent of income earners, while the bottom lost absolutely,” MIT economist Rudiger Dornbusch observes, so that “For most Americans, it is no longer true that the young generation can count on being economically ahead of its parents,” a significant turning point in the history of industrial society.

One can therefore appreciate the passionate concern expressed by political figures, corporate leaders, and their press agents over the need to create jobs for suffering Americans. Heartening indeed, until we recall our lessons. Looking a bit more closely, we find that the word “jobs” has taken on an entirely new meaning: “profits.” Thus when George Bush takes off to Japan with a bevy of auto executives in tow, he waves the banner “jobs, jobs, jobs,” meaning “profits, profits, profits,” as a look at his social and economic policies demonstrates without equivocation. The press and air waves resound with promises to increase “jobs,” put forth by those who do what is in their power to send them to high-repression, low-wage regions, and to destroy what remains of meaningful work and workers’ rights, all in the interest of some unmentionable seven-letter word. All becomes clear once we join the sophisticates who understand that “the purpose of rhetoric is to delude,” while “the purpose of governing is to do the best for ‘the country’” in the technical sense of that term.

That “the country” in whose interests policy is designed is to be understood in class terms is, of course, no recent insight. Those who have been properly instructed in the famous “canon” will surely have “learned” that Adam Smith denounced the mercantilist and colonial systems as harmful and absurd, while preaching the virtues of free trade. As explained in the introduction to the Chicago bicentennial edition of Wealth of Nations by noted Chicago economist George Stigler, “Americans will find [Smith’s] views on the American colonies especially instructive. He believed that there was, indeed, exploitation — but of the English by the colonists,” contrary to what the uninitiated might think. But few are likely to have discovered what Smith actually wrote. Mercantilism and colonialism may have harmed the general population of England, Smith concluded, but were of great benefit to the “merchants and manufacturers” who “have been by far the principal architects” of policy; their interests have “been most peculiarly attended to” by the system, though not the interests of consumers and working people. (As for the English colonists in America, the harsh regulations imposed upon them by imperial England were “a manifest violation of the most sacred rights of mankind,” Smith wrote, though nothing like the “savage injustice” of the treatment of the lesser breeds).

In brief, mercantilism and empire were among the many devices that regularly shape social and economic policy into a welfare project for the rich and powerful — definitely not the right lesson for impressionable young minds. In this and many other ways, the contents of Smith’s classic have been crucially modified as they enter contemporary ideology in the hands of his latter-day disciples, to form part of the “free market” theology preached by a variety of cynics.

Smith’s class analysis of policy-formation retains its relevance, and Linsky’s lessons add a useful supplement. If “change” once again translates into new methods of enhancing existing power and privilege, we need only recognize that that is what is best for “the country,” properly construed. Returning to the “sweeping proposals” that were instantly recognized to be “unnecessary” and “inappropriate,” we may recall that the public was not merely indicating a preference for curtailing “rising health care costs” and for unspecified “infrastructure spending.” Over two-thirds of the public have regularly called for the kind of national health insurance that exists in one or another form in all other industrial societies, in place of the highly bureaucratized “private enterprise” U.S. system, with its harsh rationing of health care by price and huge administrative expenses and other inefficiencies (the British National Health Service devoted 4 percent of total health-care expenditure to administrative costs in the late 1980s as compared with 21 percent in the U.S., the distinguished British conservative Ian Gilmour points out; the comparison with Canada is similar). And an overwhelming majority (83 percent in the latest Harris poll) protested that “the rich are getting richer and the poor are getting poorer,” that “the economic system is inherently unfair,” as the president of the polling organization summarized popular feelings. Small wonder that Reagan’s popularity is barely above Nixon’s, far below other living ex-Presidents, and that he is particularly disliked by working people and “Reagan Democrats.”

But on a wide array of matters such as these, Clinton cannot be criticized, even unfairly, for quick abandonment of “campaign pledges” that are “made to be broken,” since the pledges were never made. Both political factions, along with their sponsors and the ideological managers, understand that the childish confusions of the rabble need not be considered by the “men of virtue” who have assumed the responsibility of governing since the origins of the Republic, always seeking what is best for “the country.”

Popular confusions extend quite far. Through the 1980s, considerable majorities favored a nuclear freeze, social spending over military spending, more government regulation to protect worker health and safety, higher taxes if necessary for such purposes, and so on. The errors of the rabble are also prevalent among the other beneficiaries of the Reagan-Thatcher revolution. The British Social Attitudes survey for 1992 finds that “respondents came out in favour of public spending by bigger margins than ever,” the London Guardian reports, with 65 percent favoring higher taxes and more spending.

Perhaps they are reacting to Thatcher’s achievements in creating the worst crisis for manufacturing industry in the 19th-20th century, destroying almost one-third of the manufacturing plant within a few years by blind pursuit of Friedmanite and laissez-faire doctrines that were falsified and failed at every turn, yielding a “miserable performance” for the economy through 1990, lowering growth rate, rapidly increasing poverty while playing “Good Samaritan only to the better off” and giving London almost the appearance of “a third-world capital” — despite the huge shot-in-the-arm provided by North Sea oil and the sharp decline in prices of Third World exports (Ian Gilmour, in his incisive review of a decade of “Dancing with Dogma”). The result was to send Britain to “Europe’s poorhouse,” the Financial Times observes in October 1992, “technically poor enough to apply for extra European Community cash” along with Spain, Ireland, Portugal, and Greece. Much the same happened in Australia, where a Labor government tried the same “cruel experiment” with the same consequences, a “dismal tale of economic failure,” conservative Robert Manne points out in the business press, reviewing the well-documented “disaster.”

The destructive impact of neoliberal dogma on Third World societies has been extensively discussed. Less familiar is the fact that the three English-speaking societies, which danced with the same dogmas (though only to a limited extent, being powerful enough to violate the rules) suffered accordingly, a fact that should “have, at the very least, planted the seeds of doubt,” Manne comments, with reference to Australia. In all three societies the doubts were allayed by what MIT economist Paul Krugman describes as a “combination of mendacity and sheer incompetence,” referring specifically to attempts to suppress the truth “by the Wall Street Journal, the U.S. Treasury Department, and a number of supposed economic experts,” a record that demonstrates “the extent of the moral and intellectual decline of American conservatism,” a record matched in England and Australia.

As in the United States, the rabble in Britain have dangerous thoughts about the private economy. Asked how profits should be distributed, 42 percent chose investment, 39 percent workforce benefits, 14 percent consumer benefits (lower prices), and 3 percent shareholders/managers benefits. Asked how profits would be distributed, 28 percent predicted investment, 8 percent workforce benefits, 4 percent consumer benefits, and 54 percent shareholders/managers benefits. The conviction that the economic system is “inherently unfair” is widely shared, but well beyond the reach of the political system in societies that have succeeded in reducing the general public to a spectator role, as leading democratic theorists have long urged.

Problems of Governance

While the two factions of the business party agree over a broad range, they differ in popular constituency and sometimes in tactical preferences. These are only tendencies, reflecting shifting alliances, but they are real and sometimes have policy consequences. The popular base of the Democrats tends more towards working people, the poor, women, minorities — the rabble generally. The Republicans, who have been more open and forthright in presenting themselves as the party of owners and managers, have sought to create a popular base through appeal to jingoism, fear, religious fanaticism, and the like. That provides substantial outreach. Religious fundamentalists alone are a huge popular grouping in the United States, which resembles pre-industrial societies in that regard. This is a culture in which three-fourths of the population believe in religious miracles, half believe in the devil, 83 percent believe that the Bible is the “actual” or the inspired word of God, 39 percent believe in the Biblical prediction of Armageddon and “accept it with a certain fatalism,” a mere 9 percent accept Darwinian evolution while 44 percent believe that “God created man pretty much in his present form at one time within the last 10,000 years,” and so on. The “God and Country rally” that opened the national Republican convention is one remarkable illustration, which aroused no little amazement in conservative circles in Europe.

Needless to say, neither political faction offers its popular constituency any real influence over matters of importance to “the country,” but their concerns can be addressed at the margins. The dramatic assault on civil liberties during the Reagan years is a case in point. With a different popular base, Clinton will doubtless mitigate these policies, a matter of no small significance for personal life, though with only marginal impact on the primary task: to ensure the proper functioning of the welfare state for the rich.

Here objective problems arise that cannot be ignored, and there are differing perspectives within the business community (hence the political system) as to how they should be addressed. One major concern is “industrial policy”; that is, the state role in sustaining private enterprise; the role of the state in securing “jobs,” to resort to Politically Correct Newspeak. This was perhaps the major real issue in the 1992 election.

It is hardly a secret that every successful industrial society, from England to the East Asian NICs, achieved that condition and maintains it by radically violating market principles. These principles do serve useful functions: they can be selectively invoked to restrict social spending, to undercut competitors, and to open Third World societies to more efficient exploitation (including now much of Eastern Europe). Within the ideological system, therefore, neoliberal doctrine is highly praised. But business has always insisted that a powerful state intervene to regulate disorderly markets, organize a public subsidy for advanced industry, suppress labor and independent forces at home and abroad, and in other ways protect the interests of those who control investment and finance, and thus set the general conditions within which the “architects of policy” meet their responsibilities. Any illusions that capitalism might be a viable system vanished — apart from the secular theologians — with the Great Depression and the successful recovery from it under the wartime command economy, administered by corporate executives who learned their lessons well.

Like all advanced societies, the U.S. has relied on state intervention in the economy from its origins, though for ideological reasons, the fact is commonly denied. During the post-World War II period, such “industrial policy” was masked by the Pentagon system, including the Department of Energy (which produces nuclear weapons) and NASA, converted by the Kennedy administration to a significant component of the state-directed public subsidy to advanced industry.

By the late 1940s, it was taken for granted in government-corporate circles that the state would have to intervene massively to maintain the private economy. In 1948, with postwar pent-up consumer demand exhausted and the economy sinking back into recession, Truman’s “cold-war spending” was regarded by the business press as a “magic formula for almost endless good times” (Steel), a way to “maintain a generally upward tone” (Business Week). The Magazine of Wall Street saw military spending as a way to “inject new strength into the entire economy,” and a few years later, found it “obvious that foreign economies as well as our own are now mainly dependent on the scope of continued arms spending in this country,” referring to the international military Keynesianism that finally succeeded in reconstructing state capitalist industrial societies abroad and laying the basis for the huge expansion of Transnational Corporations (TNCs), at that time mainly U.S.-based.

The Pentagon system was considered ideal for these purposes. It imposes on the public a large burden of the costs (research and development, R&D) and provides a guaranteed market for excess production, a useful cushion for management decisions. Furthermore, this form of industrial policy does not have the undesirable side-effects of social spending directed to human needs. Apart from unwelcome redistributive effects, the latter policies tend to interfere with managerial prerogatives; useful production may undercut private gain, while state-subsidized waste production (arms, Man-on-the-Moon extravaganzas, etc.) is a gift to the owner and manager, who will, furthermore, be granted control of any marketable spin-offs. Furthermore, social spending may well arouse public interest and participation, thus enhancing the threat of democracy; the public cares about hospitals, roads, neighborhoods, and so on, but has no opinion about the choice of missiles and high-tech fighter planes. The defects of social spending do not taint the military Keynesian alternative, which had the added advantage that it was well-adapted to the needs of advanced industry: computers and electronics generally, aviation, and a wide range of related technologies and enterprises.

The Pentagon system of course served other purposes. As global enforcer, the U.S. needs intervention forces and an intimidating posture to facilitate their use. But its economic role has always been central, a fact well-known to military planners. Army Plans Chief General James Gavin, in charge of Army R&D under Eisenhower, noted that “What appears to be intense interservice rivalry in most cases…is fundamentally industrial rivalry.” It was also recognized from the outset that these goals require “sacrifice and discipline” on the part of the general public (NSC 68). It was therefore necessary, Dean Acheson urged, “to bludgeon the mass mind” of Congress and recalcitrant officials with the Communist threat in a manner “clearer than truth,” and to “scare hell out of the American people,” as Senator Vandenberg interpreted the message. To carry out these tasks has been a prime responsibility of intellectuals throughout these years.

Public acquiescence was largely secured by fear. By the 1980s, however, the cry that “the Russians are coming” was losing its efficacy. The problem of the vanishing pretext was a troublesome one throughout the decade, heightened by the erosion of public tolerance in the face of growing economic problems. Major propaganda efforts were undertaken to conjure up new demons: international terrorism, Qaddafi and crazed Arabs generally, Sandinistas marching on Texas, Hispanic narcotraffickers, etc. The absurdity of the pretexts did not prevent them from having a certain effect, though with only temporary success, a problem that must be faced.

Aiding the Pentagon

In passing, we may note that the current PR campaign in Somalia has similar motives, a fact that is scarcely even disguised. The First Landing was carefully staged for TV. Pentagon briefings directed journalists to where they were wanted, even advising them when and where “to set up their cameras” (New York Times). Pentagon officials encouraged “extensive media coverage in a bid to cast the US mission in the most positive light,” the Washington Post reported, noting “the invasion’s made-for-Hollywood quality,” which aroused considerable ridicule in Europe, and occasionally here. These officials were “eager to advertise both to Somalia and the rest of the world the precedent-setting humanitarian mission,” the Post reported, omitting the quotes around the last two words that authentic journalism would require.

The operation will be “a good experience for other countries and for us to see what effect American generosity has on these types of disaster,” the overseas relief chief of USAID, Andrew Natsios, stated: “millions of lives will be saved… Americans should feel very good about themselves” — and about the Pentagon budget that allows such miracles of generosity. Officials “didn’t hide the fact that they wanted to make it as easy as possible for the news media to cover an event that portrays them in a good light,” Peter Grier reported in the Christian Science Monitor. “With the military budget crumbling, a little favorable publicity can only help.” For the Marine Corps, the operation is a “showcase…at a time when Congress is under intense pressure to produce post-Cold War defense savings,” the Post commented, and the whole affair is nothing less than “a public relations bonanza at just the right time.” JCS chair Colin Powell added that the effort is a “paid political advertisement” on behalf of plans for an intervention force. The military “convoys were more a symbolic show for the world’s television cameras than any serious effort to get a steady stream of food moving,” New York Times correspondent Jane Perlez reported two weeks after the landing, under the heading “Somalia, We Are Here! (Now What Do We Do?).”

The intervention “seemed to be largely devoid of ulterior political motives,” Perlez added, a phrase that is obligatory even in reports that bring out clearly the overriding “political motives” and the great efforts to achieve the desired effect. These efforts were so obvious that only the most disciplined were able to suppress entirely what they knew and to marvel that the intervention “was justified solely on moral grounds” and thus put “the question of idealism in foreign policy rather purely” (New Republic editors, their emphasis).

The pretensions could hardly be taken seriously. If Washington had any humanitarian concerns for the people of Somalia, it had ample opportunity to act upon them from 1978 through 1990, when it was the major supporter of Siad Barre, the Saddam Hussein clone who was then destroying Somali society, killing 50-60,000 according to Africa Watch and setting the stage for the horrors that followed, facts regularly finessed in current media coverage. There is no evidence of a sudden religious conversion since. Furthermore, there are numerous “humanitarian missions” that could readily be undertaken if generosity were even a marginal element in policy-making. To take a case close to home, it is agreed on all sides that a few phone calls to the ruling Generals — not 30,000 troops — would probably suffice to call off the savage terror in Haiti and allow the return of the democratically-elected President Jean-Bertrand Aristide, highly popular in Haiti if not in Washington; this minimal intervention would also save any number of infants from starvation and disease. Examples abound. That aside, no one who even pretends to be serious will lend credence to a “humanitarian act” carefully staged for the world’s TV cameras — particularly when it is undertaken by a great power with a horrifying record of abuse of human rights, and a particular penchant for imposing starvation and disease on civilian societies by economic warfare (Vietnam, Cuba, Chile, Nicaragua, Iraq, …).

States are not moral agents. “Generosity” and “humanitarian missions” are tools of the trade of the commissar class in every society. Perhaps some historical examples can be found of “humanitarian intervention,” but transparently, this is not one of them.

As is fully recognized, the troops were sent well after the civil society had begun to recover and the crisis was clearly receding. “One thing is certain,” Jane Perlez emphasizes: “the worst of the Somali famine of 1992 is past.” “The Worst Was Over” (a sub-heading reads) well before the U.S. forces arrived in December. By early November, aid agencies in the distribution center in Baidoa, where the crisis was unusually severe, reported that about 80 percent of aid was reaching the most needy, and by the end of the month, the ICRC and other experienced agencies were reporting still higher figures. Recovery from Siad Barre’s atrocities in the North had been substantial well before, and even in the region of greatest suffering in the South there was visible progress, thanks in part to the efforts of the highly-regarded UN mediator Muhammad Sahnoun, who was removed after his public criticism of the incompetence of the UN operations. Serious reservations about the character of the U.S. intervention were expressed by development and relief agencies and the few people really knowledgeable about Somalia and the problems of famine, among them Rakiya Omaar, the Somali head of Africa Watch who was dismissed when she publicly opposed the intervention, and her co-worker Alex de Waal, one of the leading specialists on African famines and East Africa, who resigned in protest. The American Friends Service Committee, which has carried out development programs and relief work in Somalia for over ten years and is implementing emergency programs today, concluded “on the basis of this direct experience and our knowledge of the country and its people” that the massive military intervention is a “grave mistake” that “may be counterproductive in the long if not the short run,” interrupting and disrupting the processes of reconstruction that “have been undertaken among traditional leaders facilitated by Ambassador Mohammed Sahnoun of the United Nations and others, to try to build peace from below.” Apparently reflecting similar perceptions, the International Red Cross (ICRC), which played by far the greatest part in responding to the terrible famine that peaked in summer 1992, refused to accept U.S. military escorts for fear that this would disrupt arrangements that had been developing within Somali civil society. The British government pressured Oxfam and Save the Children, both dependent on government support, to call off their criticisms of the intervention.

Many expressed particular concern over U.S. dealings with the leading “warlords,” fearing that this may provide greater legitimacy and power to the most dangerous and destructive elements in the society. They seem to agree. Both General Mohammed Farah Aideed, the most powerful of these killers, and his ally Col. Omar Jess, who massacred over 100 civilian leaders in Kismayu in preparation for the arrival of the marines, “want to deal solely with the U.S.,” Julian Ozanne comments in the Financial Times, reporting on a protest by 500 demonstrators loyal to Aideed that disrupted the visit of UN Secretary General Boutros-Ghali to Mogadishu, which Aideed controls.

There is good reason to believe that a more modulated approach in cooperation with Somali civil society could have been effective in enhancing the recovery already underway, along lines that have been presented by Omaar, de Waal, and other close observers. The military operation that was so “eagerly advertised” may prove beneficial or harmful to Somalis in the long run, but that is incidental; they are basically props for photo opportunities.

In the UK, as here, it is commonly felt necessary to include ritual phrases about the “humanitarian mission” in analyses of actual motives. Economic correspondent Michael Prowse of the Financial Times describes the Somali intervention in these terms: “in the absence of the communist threat, the most reliable way to sustain public support for large military expenditures may be to base foreign policy on values the public holds dear. In today’s changed world [the Soviet threat having vanished], Mr. Clinton is thus being a realist, as much as an idealist, in pledging to make the promotion of democracy and human rights the guiding principles for overseas interventions” — PR devices that he did not invent, of course. Having exposed the propaganda, Prowse goes on to laud “The heartwarming presence of U.S. troops in Somalia,” where, “For the first time in recent U.S. history (perhaps ever), a sizable military intervention overseas was justified on purely moral grounds.” A high tolerance for self-contradiction is a virtual necessity for intellectual respectability, given the need to invest the actual workings of power with suitable majesty.

The day after the intervention, in an article on the U.S. economy not mentioning Somalia, Prowse cited U.S. economists in corporate and financial institutions who attribute the sluggishness of recovery from the recession to the decline in military spending, which eliminates a traditional device for stimulating the economy. In brief, the stakes in “sustaining public support for large military expenditures” are high.

According to the official version of the timing presented most fully by Don Oberdorfer of the Washington Post on the basis of official leaks, the decision to intervene was taken on November 21 on the grounds that “the need is crying” and “only the United States can do something.” That story lacks any credibility; “the need was crying” months earlier, and was declining by late November thanks to the efforts of others.

It is possible that the intervention had been planned for the post-election period. In early November, a marine colonel in civilian clothes was seen by reporters in Baidoa, apparently scouting out the area where a major base would be established; at the time, U.S. military personnel were restricted to the cargo planes delivering supplies. A “humanitarian intervention” just before election day would have seemed too cynical a ploy, undermining the PR function. An earlier intervention would have faced two problems. The first is that the situation had not yet begun to settle. The operation would have been far more risky, and it was not so obvious then that the appearance of success could be quickly achieved; similar considerations rule out “humanitarian intervention” in Bosnia, even more strongly. Second, it is widely felt that things might go sour after the initial PR bonanza, and the Administration surely did not want to face such problems under the glare of election klieg lights. The post-election timing is preferable. Order was being restored, so the appearance of success is more likely. Bush’s term can end in a blaze of glory. If the “purely idealistic” effort turns into the usual disaster, on the model of Grenada, Panama, and so on, attention will have waned or someone else will have to pick up the pieces and suffer the political consequences.

As in earlier efforts to sustain the Pentagon system, the Somali intervention may serve other purposes. The U.S. supported Siad Barre through his worst atrocities because of its interest in Somali bases for the intervention forces aimed at the Middle East and for possible operations in Africa; such considerations might remain of some importance (not much, I suspect, alternatives being readily available). Furthermore, in large parts of Africa and the Middle East the rise of Islamic fundamentalism (which may well be accelerated by the intervention) is a matter of growing concern, for traditional reasons: like secular nationalist tendencies, liberation theology, labor and peasant organizing, democratic socialist political initiatives, some military regimes, and other potentially independent forces, Islamic fundamentalism falls under the rubric of “ultranationalism,” a term that covers any threat of deviation from the subordinate role assigned to the service areas, whatever its political coloration. Nevertheless, it seems likely that at the current moment, the prevailing factor is the domestic one, the crisis of state industrial policy, as the more serious commentary and reporting often indicates obliquely.

Industrial Policy for the 1990s

The decline of the traditional form of industrial strategy is a serious matter. To convince the taxpayer to subsidize advanced industry by the methods designed in the early postwar years is becoming increasingly difficult. It is not surprising, then, that we now hear open discussion of the need for “industrial policy” — that is, new forms, no longer masked by the Pentagon system.

The old methods were running into difficulties for reasons beyond the loss of the standard pretext and the erosion of tolerance on the part of people suffering the effects of Reaganite spend-and-borrow abandon. The Pentagon system of industrial subsidy and planning has obvious inefficiencies. These were tolerable in the days of overwhelming U.S. economic dominance, less so as U.S.-based corporations face serious competitors who can design and produce directly for the commercial market, not awaiting possible spin-offs from high tech weapons or space shots. Furthermore, the cutting edge of industrial development is shifting to biology-based technology. That is one reason why the West, with the U.S. in the lead, is insisting that GATT agreements and NAFTA (North American Free Trade Agreement) provide enhanced protection for patents (“intellectual property”), thus locking the Third World into dependency on high-priced products of Western agribusiness, biotechnology, the pharmaceutical industry, and so on. It is important to ensure that TNCs control seeds, plant varieties, drugs, and the means of life generally; by comparison, electronics deals with frills. Public subsidy and state protection for biology-based industries can not easily be hidden behind a Pentagon cover. For such reasons alone, new forms of state intervention are required (see Year 501, South End Press).

In the 1992 electoral campaign, the Democrats showed more awareness of these issues, gaining support from sectors of the corporate world that recognized them to be more attuned to real world problems than Reaganite ideologues. Not that Reaganites were reluctant to use state power to protect the wealthy from market forces. The primary mechanisms were the usual military Keynesian ones. To mention one striking case, a 1985 OECD study found that the Pentagon and Japan’s state planning ministry MITI were distributing R&D funds much the same way, making similar guesses about new technologies. A major Pentagon funnel was SDI (“Star Wars”), which was openly advertised as a state subsidy to the “private sector,” and lauded by the business press for that reason. The Reagan-Bush decade ended in fall 1992 with a well-publicized improvement in the economy, attributed in the business press to a sharp rise in military spending, much of it for computer purchases. While almost all industrial societies became more protectionist in past years, at great cost to the Third World, the Reaganites led the pack, introducing more import restrictions than all postwar administrations combined. British MP Phillip Oppenheim, ridiculing Anglo-American posturing about “liberal market capitalism,” notes that “A World Bank survey of non-tariff barriers showed that they covered 9 per cent of all goods in Japan — compared with 34 per cent in the U.S. — figures reinforced by David Henderson of the OECD, who stated that during the 1980s the U.S. had the worst record for devising new non-tariff barriers” (basically, ways to strong-arm competitors). He adds that OECD figures show U.S. state funding for non-military R&D to be about one-third of all civil research spending, as compared to 2 percent state funding in Japan. The Thatcher record is similar.

The Reaganites also conducted the biggest nationalization in U.S. history (the Continental Illinois Bank bailout) and enabled the steel industry to reconstruct by effectively barring imports and undermining unions to reduce labor costs. They are leaving Washington with heavy new restrictions on European Community steel exports that the EC claims violate international trade rules; Washington’s justification is alleged EC dumping, but the EC responds that total EC steel exports had fallen below the “voluntary quota” (the Reaganite non-tariff barrier). The Reagan administration sharply increased export-promotion by means of Export-Import bank credits in apparent “violation of the GATT,” Eximbank chair John Macomber concedes. They conducted “what was effectively an ‘industrial policy’” (contrary to official rhetoric) that rebuilt the U.S. computer chip industry by such means as an agreement “essentially forced on Japan” to increase purchases of U.S. chips and by establishment of the government-industry consortium Sematech to improve manufacturing technology, the Washington Post reported, quoting Charles White, vice president for strategic planning at Motorola, the second-biggest U.S. chip maker, who said: “You can’t underestimate the government’s role.”

Despite such achievements, the Reagan-Bush faction remains hampered by ideological extremism, unable to face current problems of industrial strategy as directly as their political opponents, some elements of the corporate-financial world assume. Clintonite thinking on this issue is reflected in the choice of Berkeley Professor Laura Tyson as Chairperson of the Council of Economic Advisors. Tyson was a founder and codirector of the Berkeley Roundtable on the International Economy, a corporate-funded trade and technology research institute that advocates unconcealed state industrial policy. She has “longstanding relationships with Silicon Valley companies that stand to benefit from the policies she advocates,” Times business correspondent Sylvia Nasar notes. In support of these policies, Roundtable co-director Michael Borrus cites a 1988 Department of Commerce study showing that “five of the top six fastest growing U.S. industries from 1972 to 1988 were sponsored or sustained, directly or indirectly, by federal investment,” the only exception being lithographic services. “The winners” in earlier years, he writes, “computers, biotechnology, jet engines, and airframes were each the by-product of public spending for national defense and public health.” The record goes back to the earliest days; “defense” and “public health” are the familiar Newspeak disguises, perhaps a shade less deceptive than “free market neoliberalism.”

Such familiar lessons of economic history can no longer be concealed, as the Pentagon system and the Cold War ideology have eroded. The interventionist measures of the Reaganites reflect these needs, as does the increasingly open discussion of “industrial policy.” A recent study of the National Academy of Sciences and Engineering proposed a $5 billion quasi-governmental company “to channel federal money into private applied research”; that is, publicly-funded research that will yield private profit. Another report, entitled The Government Role in Civilian Technology: Building a New Alliance, calls for new efforts to extend “the close and longstanding” government-industry relationship that has “helped to establish the commercial biotechnology industry.” It recommends a government-funded “Civilian Technology Corporation” to assist U.S. industry to commercialize technology by encouraging “cooperative R&D ventures in pre-commercial areas”; “pre-commercial,” to ensure that profit is restricted to private wealth and power. The ventures will be “cooperative,” with the public paying the costs up to the point of product development. At that point costs change to gains, and the public hands the enterprise over to private industry, the traditional pattern.

“America cannot continue to rely on trickle-down technology from the military,” Clinton stated in a document issued by his campaign headquarters in September 1992 (“Technology: The Engine of Economic Growth”). The old game is ending. In the “new era” planned by the Clinton administration, Times science writer William Broad reports, “the Government’s focus on making armaments will shift to fostering a host of new civilian technologies and industries” — just as in the “old era,” but then behind the Pentagon mask. “President Clinton proposes to redirect $76 billion or so in annual Federal research spending so it spurs industrial innovation” in emerging technologies — which, in unmentionable fact, were largely funded through the Pentagon system (and the National Institute of Health) in the “old era.” A minimum of $30 billion is to be taken from the Pentagon’s research budget as a “peace dividend” over four years for these purposes, Broad writes, noting that: “Significantly, the initiative would spend the same amount of money as Star Wars, $30 billion, in half the time.”

Also significantly, Clinton’s advisers knew all along that Star Wars was “only tangentially related to national defense,” that its prime function was to serve as “a path to competitiveness in advanced technologies,” as publicly explained in Congressional Hearings (Clinton’s close associate Robert Reich, now Secretary of Labor, writing in 1985 in the New York Times under the heading “High Tech, a Subsidiary of Pentagon Inc.”). As noted earlier, the function of Star Wars as part of the system of public subsidy, private profit, was made clear to the business world from the start, though largely concealed from the general public by the doctrinal managers.

The Wall Street Journal reports a study by Battelle Memorial Institute showing that research spending will remain sluggish because of “a slowdown in weapons development.” “Government spending over the past five years has swung toward space and energy programs, and away from weapons development,” the principal author of the report said. That is, government spending (the public subsidy) shifted from one component of the Pentagon system to the others.

“We’re now going to develop an economic strategy much in the way we developed a national security strategy to fight the cold war,” Kent Hughes, president of Clinton’s Council on Competitiveness, proclaimed. It is necessary only to bring out the striking continuities as old policies are adapted to new contingencies, and to reinterpret the “cold war” as what it was. A related matter is the traditional business demand that the public, via government, pay the costs of the infrastructure required for private power and profit, everything from roads to education. By now, even such enthusiasts for Reagan’s party for the rich as the Wall Street Journal are concerned by the consequences of the policies they advocated, such as the deterioration of the state college systems that supplied the needs of the corporate sector. “Public higher education — one of the few areas where America still ranks supreme — is being pounded by state spending cuts,” the Journal worriedly reports, echoing the concerns of businesses that “rely heavily on a steady stream of graduates” for skilled personnel and on applied research that they can exploit. This is one of the long-predicted consequences of the cutback of federal services for all but the wealthy and powerful, which devastated states and local communities. Class war is not easy to fine tune.

That Clinton will be able to address these problems is not at all clear. Frivolous Reaganite policies left the country deeply in debt at all levels, from the federal government to households. Interest on the federal debt has skyrocketed, now reaching the scale of the days when the costs of the World War had to be faced. Had the borrowing been used for productive investment or R&D, it could have been justified. But it was not. Rather, it was largely frittered away in luxury consumption, financial manipulations and swindles, and other Yuppie fun-and-games, much as in Thatcherite England, the other “revolution” much admired by the privileged. A National Science Foundation study at the peak of the mania estimated that R&D expenditures declined by 5 percent for companies involved in mergers and acquisitions compared to a 5 percent rise for others. Meanwhile real wages declined, hunger and deep poverty rose rapidly, the jail population zoomed, and the society began to take on a distinct Third World aspect. Given the debt, even the kinds of “moderate increase in infrastructure spending” and other devices that Clinton advisers are willing to contemplate, reflecting business concerns, may not be feasible.

Who Decides? For Whom?

The standard rhetorical cloak for the new “economic strategy” is that its goal is to provide jobs. That is not false, as long as we recall the meaning of the term “jobs” in Politically Correct Newspeak. Whether the strategy will provide jobs, and for whom, is debatable. What is not debatable is that the driving concern remains the unmentionable seven-letter word, and that the public is to be excluded, completely, from any participation in formulating this “economic strategy.” The latter principle follows from the guiding doctrine of elite democratic theory: the public are to be spectators, not participants in managing public affairs, which are none of their business. The urgency of preserving this principle is highlighted by the curious confusions that the public manifests, reviewed earlier.

The guiding doctrines, of course, have far more general application. To mention one interesting case, in Poland “Public resistance to privatization, especially among workers, has been evident since early in the post-Communist period,” the director of Russian and East European studies at George Washington University, Sharon Wolchik, observes: “A 1990 survey, for example, found that only 13 percent of workers, but 37 percent of directors, favored private ownership of their enterprise,” with over one-third of both workers and directors favoring state and employee ownership. But the attitudes of the population are inconsequential in the “new democracies” — one reason, perhaps, why “the Communist era is looking better and better” to Poles, as another academic specialist observes (Jane Leftwich Curry).

Whether in Somalia, or Poland, or any other choice that one may make, the concerns of the general population are as incidental to the architects of policy as in the days of Adam Smith’s England. And crucially, the rabble must be kept from interfering with the plans that will determine their fate.

While state managers may attempt to adapt the traditional devices of public subsidy and protection to new contingencies, they will surely continue to support the main lines of policy: extending the globalization of the economy and establishing more firmly the decision-making apparatus that is taking shape to serve the interests of the supranational industrial and financial institutions. These are important features of the current era, discussed in earlier articles here (see my Z articles in May, July/August, November, and Edward Herman’s “Doublespeak,” November 1992).

Nixon’s dismantling of the international economic system was one of several factors leading to a huge increase in unregulated capital, beyond the power of governments to control. The rich societies are no longer immune, as European central banks learned a few months ago. Even the United States, still the world’s largest economy and most powerful state, is facing these problems. The U.S. can freely disregard IMF “advice,” as the Bush administration showed in October when the IMF prescribed deficit-cutting measures including new taxes, and “fundamental” health care reforms — the kind of “advice” on structural adjustment that amounts to orders for the Third World, however harmful the consequences, Doug Henwood notes, reporting the U.S. rejection. But it is not beyond the reach of international bond investors, who “may now hold unprecedented power — perhaps even a veto over U.S. economic policy,” the Wall Street Journal reported immediately after the election. This consequence of the huge Reagan-Bush deficit will serve as a brake on any odd ideas that Clinton advisers might have about spending, the Journal noted reassuringly; spending of the wrong kind, that is, not directed to the needs of “the country,” in the technical sense.

Related developments of the past several decades have accelerated the globalization of the economy, along with its immediate corollary: a growing superfluous population at home as production shifts to high repression, low wage areas (and, at the same time, productivity gains reduce the need for industrial workers). The superfluous people are becoming less significant as a market as well. Increasingly, production can be shifted to poor and oppressed populations and directed to the relatively wealthy, a small sector in the traditional Third World, a far larger one in the advanced industrial societies. The model pioneered by Henry Ford — wages high enough for domestic workers to provide a market — may decline along with the national economies on which it was based. Rhetoric aside, these are not likely to be serious concerns of the “principal architects” of policy, any more than they have been in the past.

The reversion of much of East Europe to its traditional Third World status offers new weapons against U.S. workers (and Western workers generally). As widely reported, GM plans to close two dozen plants in the U.S. and Canada. Meanwhile it has opened a $690 million assembly plant in East Germany with great expectations, heightened by the fact that, thanks to 43 percent unofficial unemployment, workers are willing to “work longer hours than their pampered colleagues in western Germany” at 40 percent of the wage and with few benefits, the Financial Times cheerily explains. Capital can readily move; people cannot, or are not permitted to by those who applaud Adam Smith’s doctrines when it suits their needs. Jobs may disappear in the West; “jobs” in the technical sense will do just fine.

The U.S. (like other states) will continue to defend U.S.-based corporate and financial interests while seeking to maintain a global environment in which they can flourish. That requires, in particular, that the Third World be kept in its service role. Meanwhile at home, state power will continue to be employed to dissolve popular structures (unions, etc.) that might serve the needs of the general public and enable them to interfere illegitimately in the management of public affairs. It will also be necessary to find ways to control the growing “Third World at home,” no small problem. The Clinton Mandate for Change promises no change in these respects.

Much of world trade (close to half, by some estimates) consists of intrafirm transfers — centrally managed trade, internal to particular TNCs and guided by a highly “visible hand,” to borrow the phrase of business historian Alfred Chandler. In an important critical analysis of the GATT, World Bank economists Herman Daly and Robert Goodland point out that in prevailing economic theory, “firms are islands of central planning in a sea of market relationships.” “As the islands get bigger,” they add, “there is really no reason to claim victory for the market principle” — particularly as the islands approach the scale of the sea, which departs radically from free market principles, and always has, because the powerful will not submit to these destructive rules.

As in the past, political institutions are taking shape to reflect the realities of private economic power: the IMF and World Bank, G-7, NAFTA, and other elements of the “de facto world government” described by the international financial press as the executive for the “new imperial age.” These processes allow major decisions to be insulated from parliamentary institutions, which may be infected by public influence. This important development carries forward the long-term project of safeguarding wealth and privilege from public interference and overcoming the threat that democratic forms might have actual substance. Increasingly, the general public are not even aware of major decisions that will determine their fate, hence are in no position to influence them. A good part of the popular concern in Europe over instituting EC structures has to do with “the democratic deficit,” “the fact that policies escape parliamentary control at the national level and do not come under equivalent control at the Community level” (John Lambert). The same problems are arising here, though they are less discussed in our more depoliticized society.

Neither at home nor abroad does the real world bear much resemblance to the dreamy fantasies now fashionable among intellectuals about History converging to an ideal of liberal democracy that is the ultimate realization of Freedom. Consider NAFTA. One may debate the consequences, but no one doubts that they will be large in scale. The NAFTA is an executive agreement reached on August 12, 1992, just in time to become a major issue in the electoral campaign. It was mentioned, but barely. The Trade Act of 1974 established a Labor Advisory Committee (LAC), based in the unions, which is required by law to provide advice and information to the executive branch before any trade agreement is reached. The LAC was advised that its report was due on September 9, 1992. A complete draft of the text of this elaborate treaty was made available one day before, on September 8, making it impossible for the LAC to formally meet, as directed by law. One could hardly conjure up a more striking example of utter contempt for democracy. Furthermore, the LAC notes, “the administration refused to permit any outside advice on the development of this document and refused to make a draft available for comment.”

The situation in Canada was similar. The British Columbia Teachers Federation wrote a sharply critical report on the treaty draft, noting the “impossible limitations on the operation of this committee,” with absurd time constraints and exclusion of entire provinces from any review of the lengthy and complex executive agreement.

Despite the contemptuous dismissal of both the law and the public, LAC did provide a review of NAFTA, concluding that while the treaty would be a bonanza to investors (as all agree), it would severely harm American workers (about 70 percent of them, even by the analysis of the advocates). It will very likely harm Mexican workers as well, the LAC report notes, as do other studies. One predicted consequence of the agreement is a rapid increase in rural migration to urban areas as Mexican corn producers are wiped out by U.S. agribusiness exports, depressing still further wages that have fallen some 60 percent during the past decade and are likely to remain low, thanks to the harsh repression of labor that is a crucial part of the highly-touted Mexican “economic miracle.” Property rights are well protected by the agreement, LAC and other analysts note, while workers’ rights are ignored.

The treaty is also likely to have harmful environmental consequences; production can shift to regions where enforcement of laws is lax or non-existent, and regulations imposed by parliamentary bodies can be overridden as “unfair restraint of trade,” processes already underway in the framework of the U.S.-Canada “free trade” agreement. In general, the LAC report concludes, “U.S. corporations, and the owners and managers of these corporations, stand to reap enormous profits. The United States as a whole, however, stands to lose an enormous amount.” The country will suffer; “the country” — in the Newspeak sense — will, again, do just fine.

On a wide range of issues, the LAC report observes, NAFTA “will have the effect of prohibiting democratically elected bodies at [federal, state, and local] levels of government from enacting measures deemed inconsistent with the provisions of the agreement,” including measures on the environment, workers’ rights, health and safety, etc. The LAC report calls for the treaty to be renegotiated, offering a series of constructive proposals.

Neither the contents of this important critical analysis, nor the scorn for law and democracy shown by the Bush administration, were reported. These matters are of no interest to the ideological institutions, or more accurately, are of negative interest — suppression is necessary, in the interests of “the country.” Citizens know next to nothing; indeed, subversion of democracy has reached such remarkable heights that they do not even know that they know nothing. Congress abdicated responsibility. The Clinton camp had little to say. In such ways, we can approach the long-sought ideal: formal democratic procedures that are utterly devoid of meaning, as citizens not only do not intrude into the public arena, but have scarcely an idea of the policies that will shape their lives.

It is a striking fact that although these critical issues have been kept almost entirely out of the public domain, 60 percent of the public do have an opinion about NAFTA, opposing it by nearly 2-to-1 in October 1992. As usual, that was irrelevant to the presidential campaign, then in its final weeks. It was enough, however, to frighten the Wall Street Journal, which ran a fevered front-page story warning of the “diverse coalition of grass-roots foes” that is “fighting ‘Nafta’,” including the labor movement, populist farm groups, and environmental and religious organizations. These dangerous elements “hit pay dirt,” the Journal reports ominously, receiving funds (a magnificent $50,000) from a branch of the Unitarian Church.

The corporate world is, naturally, shaking in its boots at the thought that its monopoly might be challenged. The lesson for the rest of us is obvious.

More generally, people have little specific knowledge of what is happening around them. An academic study that appeared right before the presidential election reports that less than 30 percent of the population was aware of the positions of the candidates on major issues, though 86 percent knew the name of George Bush’s dog. The general thrust of propaganda gets through, however. When asked to identify the largest element of the federal budget, less than 1/4 give the correct answer, military spending. Almost half select foreign aid, which barely exists; the second choice is welfare, chosen by 1/3 of the population, who also far overestimate the proportion that goes to blacks and to child support. And though the question was not asked, virtually none are likely to be aware that “defense spending” is in large measure welfare for the rich. Another result of the study is that more educated sectors are more ignorant — not surprising, since they are the main targets of indoctrination. Bush supporters, who are the best educated, scored lowest overall. The study also shows that Republican propaganda (however fraudulent) passed through the media with greater effect than the Democrats’ counterpart, an inconvenience for the charges of “liberal bias” that are particularly relished by the liberal media, which — as usual — greatly appreciate such condemnation as a tribute to their fiery independence of power.

With regard to all of these issues, two distinct questions arise. What will be the likely consequences of the policies under consideration? Who decides? The answer to the second question is clear: the “principal architects” of policy are the traditional ones. The public has essentially no role, and with the recent advances in destruction of democracy, no knowledge.

As noted, the first question can be debated. Perhaps, as the scant media coverage generally takes for granted reflexively, the NAFTA will benefit all; the Clinton and the Reagan-Bush factions are earnestly seeking to improve the lives and prospects of the general public, differing only on how to achieve this result; all are committed to free trade, which is obviously the greater good; etc. Maybe there really is a tooth fairy. Perhaps. No matter what one believes, it cannot be doubted that the policy questions require careful scrutiny and analysis. And that they will not receive, certainly not on the part of those whose lives and fate are at stake, unless they organize to do something about it.

https://chomsky.info/199302__/





