Tuesday, June 30, 2020

The World Rises Out of Extreme Poverty

Between 1750 and 1820, capitalism took hold in the Western world. It would prove to bring more prosperity than the world had ever known.

In 1800, about 90% of the world's population lived in extreme poverty. In 2019, about 10% did. Over that roughly 220-year span, the share of Earth's population living in extreme poverty fell by 80 percentage points.

According to the World Bank, people living in ordinary poverty still have access to shelter, food, clean water, and basic services provided by the government or private entities. People living in extreme poverty, by contrast, are severely deprived of basic human needs and often lack access to even these basic services.
Most of a country's shift out of extreme poverty coincides with its industrialization.

The entire population of the U.S. has been effectively out of extreme poverty since about 1940. 

Most of China's population was stuck in extreme poverty until 1976, after which the country suddenly began advancing out of it, slowly at first, then much more quickly, as if it had been unnaturally constrained before.

India's rise out of poverty is also impressive, though it started progressing earlier than China and has maintained a somewhat slower pace.

The largest country whose population still mostly falls below the extreme poverty line is the Democratic Republic of the Congo, which, despite significant population growth, has fallen behind other countries in its region in recent decades.

World GDP per capita was essentially flat for 2,000 years. It began increasing during and after the Enlightenment. Beginning about 1820, the rate exploded. 

Further Reading

Why Nations Fail: The Origins of Power, Prosperity, and Poverty

What explains the enormous differences in income per capita that exist across the world today? The question has been posed many times over. The gaps in prosperity that surround us in the modern age are much wider than those that motivated Adam Smith to write The Wealth of Nations in 1776, which of course is where the modern discipline of economics began.

Daron Acemoglu and James Robinson conclusively show that it is man-made political and economic institutions that underlie economic success (or lack of it). Korea, to take just one of their fascinating examples, is a remarkably homogeneous nation, yet the people of North Korea are among the poorest on earth while their brothers and sisters in South Korea are among the richest. The south forged a society that created incentives, rewarded innovation, and allowed everyone to participate in economic opportunities.

The economic success thus spurred was sustained because the government became accountable and responsive to citizens and the great mass of people. Acemoglu and Robinson explain how institutions are key to prosperity, and how bad or extractive institutions prevent prosperity. 
The Birth of Plenty

In The Birth of Plenty, William Bernstein, the bestselling author of The Four Pillars of Investing, presents his theory of why prosperity has been the engine of civilization for the last 200 years.

This is a “big-picture” work that highlights and explains the impact of four elements that, when occurring simultaneously, are the fundamental building blocks for human progress:
  • Property rights, which drive creativity;
  • Scientific rationalism, which permits the freedom to innovate without fear of retribution;
  • Capital markets, which provide funding for people to pursue their visions;
  • Transportation/communication, which allows for the effective transfer of ideas and products.

The Age of Abundance

Until the 1950s, the struggle to feed, clothe, and employ the nation drove most of American political life. From slavery to the New Deal, political parties organized around economic interests and engaged in fervent debate over the best allocation of agonizingly scarce resources. 

But with the explosion of the nation's economy in the years after World War II, a new set of needs began to emerge—a search for meaning and self-expression on one side, and a quest for stability and a return to traditional values on the other.

Lindsey offers a bold reinterpretation of the latter half of the twentieth century. In this history of postwar America, the tumult of racial and gender politics, the rise of the counterculture, and the conservative revolution of the 1980s and 1990s are portrayed in an entirely new light. 

How the West Grew Rich

How did the West—Europe, Canada, and the United States—escape from immemorial poverty into sustained economic growth and material well-being when other societies remained trapped in an endless cycle of birth, hunger, hardship, and death?

In this elegant synthesis of economic history, two scholars argue that it is the political pluralism and the flexibility of the West's institutions—not corporate organization and mass production technology—that explain its unparalleled wealth.

This book is a summary of the factors that were necessary to the breakthrough, or series of economic breakthroughs (they actually mention at least five: the expansion of trading and markets, the first and second industrial revolutions, the modern information/technology revolution, and science).

Sunday, June 21, 2020

History is Important, But No Longer Taught

"From this I conclude that the best education for the situations of actual life consists of the experience we acquire from the study of serious history. For it is history alone which without causing us harm enables us to judge what is the best course in any situation or circumstance." -- Polybius, The Rise of the Roman Empire (about 100 BCE).

A survey by the American Council of Trustees and Alumni found that “more Americans could identify Michael Jackson as the composer of ‘Beat It’ and ‘Billie Jean’ than could identify the Bill of Rights as a body of amendments to the U.S. Constitution,” “more than a third did not know the century in which the American Revolution took place,” and “half of the respondents believed the Civil War, the Emancipation Proclamation or the War of 1812 were before the American Revolution.” Oh, and “more than 50 percent of respondents attributed the quote ‘From each according to his ability to each according to his needs’ to either Thomas Paine, George Washington or Barack Obama.” 

Only 40 states require a history course to graduate high school. Only 15 require an examination. 

Historians may not want to admit it, but they bear some blame for the increasing irrelevance of their discipline. As historians Hal Brands and Francis Gavin argue on the national security site War on the Rocks, since the 1960s, history professors have retreated from public debate into their own esoteric pursuits. The push to emphasize “cultural, social and gender history,” and to pay “greater attention to the experiences of underrepresented and oppressed groups,” they write, has been a welcome corrective to an older historiography that focused almost entirely on powerful white men. But like many revolutions, this one has gone too far, leading to the neglect of political, diplomatic and military history — subjects that students need to study and, as enrollment figures indicate, students want to study but that universities perversely neglect. Historian Jill Lepore notes that we have ditched an outdated national narrative without creating a new one to take its place, leaving a vacuum to be filled by tribalists.

Students will learn what educators value. And if recent scores on national exams for history, geography, and civics are any indication, American educators are undervaluing the knowledge that young people will need to protect our political and civil institutions as adults.

The National Assessment of Educational Progress (NAEP) recently released its 2018 assessment results for eighth-grade students in civics, US history, and geography. The results are not encouraging: compared to scores from 2014, average scores on the geography and history exams in 2018 saw a statistically significant drop of three and four points, respectively. Average scores on the civics exam declined by one point. The average score on the NAEP history exam is at its lowest point since 2006, and the average score for geography is at its lowest point since the subject became a main segment of the assessment.

In addition to a decline in scores, the new NAEP scores also reveal a lack of proficiency in all three subjects. Roughly a quarter of all US students who took the test scored well enough to be labelled “proficient” or better in civics or geography. In history, only 15% scored at proficiency or better.

Another concern: the influence of postmodernist thought in much historical debate. In brief, it’s a notion borrowed from language theorists that no textual interpretation can be “privileged” above another. That, some historians say, might be fine for literature classes. But it’s disastrous in historical thinking and even more so in public life, where it presages the idea of “alternative facts,” to borrow a phrase that’s emerged in recent political discourse.

“Americans can argue over different interpretations because history and civic issues are complex, and there are different ways to think about them,” said Stanford education professor Sam Wineburg. “But the basis has to be an agreement of what constitutes facts.”

To become strong and productive members of society, students need a strong grasp of our country’s history, a deep understanding of our country’s system of government, and broader knowledge of the issues impacting the world. If educators wish to see a rise in proficiency in subjects like civics, they must dedicate time to these subjects.

History Teacher Throws Out Textbooks For Radical Marxist ‘History’ Book

by Evita Duffy

A young high school history teacher, named Annie, posted a TikTok video raving about Howard Zinn’s “A People’s History of the United States,” which she explains will be her new classroom textbook.

In her TikTok video, posted in the wake of the George Floyd riots, Annie explains she will no longer be using official textbooks in her classroom because they “omit a lot of the truth from our past.” Ironically, “A People’s History” is replete with factual omission to twist history in order to fit a narrative of American shame. For example, according to “A People’s History,” the American Revolution was waged in order to defeat “potential rebellions and create a consensus of popular support for the rule of a new, privileged leadership”. Civil War soldiers fighting to preserve the Union were deceived by “an aura of moral crusade” against slavery which “worked effectively to dim class resentments against the rich and powerful, and turn much of the anger against ‘the enemy.'”

Interestingly, Howard Zinn made no bones about his motivation and goals for writing “A People’s History.” Reputable historians attempt to teach history without being selective or misleading. Zinn, on the other hand, openly admitted that he became a historian in order to inspire a social revolution: “I came to history with a very sort of modest objective, I wanted to change the world.”

Zinn’s young followers, who are consumers of his left-wing reinterpretation of the United States, see America as morally flawed. “If people knew history,” Zinn proclaimed, “they would scoff at the idea that the United States is a force for the betterment of humanity.” Unsurprisingly, the American left has enthusiastically embraced “A People’s History” and Zinn’s belief that the United States is, at its core, and from its inception, a sexist, racist, xenophobic, and bigoted nation. America, according to the left, is irredeemable, and her only hope is for a revolution that will fundamentally transform it.

Annie’s rejection of classic, credible history textbooks for overt leftist propaganda may seem shocking, but it is actually a perfect example of the new normal in the American education system. Stanford University School of Education Professor Sam Wineburg explains how “A People’s History” went from being a quirky outlier to the new accepted view of American history, “In the 32 years since its original publication, ‘A People’s History’ has gone from a book that buzzed about the ear of the dominant narrative to its current status where, in many circles, it has become the dominant narrative. For many students, ‘A People’s History’ will be the first full-length history book they read, and for some, it will be the only one.”

Annie stressed, “These people’s stories need to be heard. They need to be shared, because this is what built America and this is why we are the way we are.” In a way, it’s hard to argue with Annie. The fact that the Marxist-inspired “A People’s History” is the new standard in American history curricula is precisely the reason why we are the way we are in the summer of 2020. The race riots, the destruction of historical monuments, the Pulitzer Prize-winning 1619 Project, and the damaging notion of collective American guilt, specifically white guilt, are all products of the Zinn narrative.

The culture of American shame composed by Howard Zinn has permeated pop culture, inspiring disturbing videos of young white people expressing shame for their race and accepting responsibility for the sins of America’s past. It’s even being embraced by corporate America, which has been a massive funder of the Black Lives Matter movement. This past week, Chick-fil-A’s CEO suggested that white Americans should shine the shoes of black Americans as atonement for America’s racist history.

The root of all this, though, is in America’s teachers colleges, which have been quietly radicalizing over the course of many decades. Even Zinn, when he wrote “A People’s History,” would be surprised by how quickly his ideas have been accepted and wholeheartedly embraced in American high schools and even elementary schools. Teachers like Annie are passing these distorted and politically infused ideas of history to their students with little or no critical analysis or opposing points of view.

It will take a massive effort on the part of honest historians to undo the damage done by Howard Zinn and restore the truth about America’s founding, shortcomings and proud history as the architects of freedom and self-government.

Why Students Have Turned Away from History

by David Kaiser

In the 1970s and 1980s, when social history became fashionable, its practitioners sold it as an attempt to learn more about workers, peasants, and other less-visible social sectors that traditional political history had tended to slight. Feminists and nonwhite scholars picked up that ball and ran with it, arguing that they represented identities that white male historians had ignored, and whose voices now needed to be heard.

By the turn of the new century, even to study the political leadership of Western countries in detail had become suspect in history because it supposedly reinforced white male hegemony in society.

The long-term impact of those changes emerges when one looks at what historians do study today. The program of the last annual meeting of the American Historical Association lists 300 different panels on different historical topics. Only 15 of those 300 (5 percent) deal with political history.

We must, however, look at those panels individually to understand what “political history” now means.

The sessions dealt with:
  • the funding of Sesame Street in the 1970s;
  • the authorship of Wikipedia articles about women’s suffrage in the US;
  • ideas of female monarchy in the Middle Ages;
  • the intellectual influence of the right after 1945 in various countries;
  • the recent immigrant rights movement in the US;
  • Fascist and Communist ideas of war during the Sino-Japanese conflict in the 1930s;
  • several populist episodes in recent American politics;
  • a panel discussion of historians and presidential misconduct;
  • various nonwhite feminist political movements;
  • a panel on the gender of power;
  • the politics of gun control;
  • women and religious liberty in early America;
  • a panel on writing the history of American conservatism under Donald Trump; and
  • human rights and state constitutions, 1796-1861.
In short, only three panels touched on major national issues in the US, and not a single one dealt with a Western European political issue of any kind. None dealt with presidential leadership, the passage and impact of a major piece of legislation, or the origins, course, and results of a war.

Because of this shift, we know much less about the politics and diplomacy of the last 40 years or so than we do about earlier periods. Whereas dozens of serious archival books had been written on the politics of the 1930s and 1940s by the time I was in graduate school, there are practically no serious studies of US political and diplomatic history since 1980 or so today.

Almost no one is either trained to write them or given a tenure-track job for having done so.

I had been teaching the history of warfare at the Naval War College for 16 years in 2006 when a political scientist at Williams College invited me to spend a year in a new chair in American diplomatic history that he had managed to create. I found later that when he initially floated his plans to the chair of the history department, she asked why he wanted to do that, since “that’s not what historians do anymore.”

Yet, during my year there, the courses I taught on the US and the two world wars and on Vietnam were extremely popular, and some students regretted that there were not more of them available.

Meanwhile, I saw the impact of the changes reflected in AHA programs on undergraduate curriculums. As departments became larger and faculty became more specialized, the distinction between undergraduate and graduate education was lost.

A historian of gender and sexuality in France (to select a random example that does not refer to a specific individual) offered undergraduate courses on gender and sexuality in France, without feeling any obligation to educate students about critical political events. Such courses predictably drew small enrollments, but faculty didn’t care.

At departmental lunches, I heard faculty report without a shred of embarrassment that their classes had half a dozen students in them—much less any analysis of whether their contribution to teaching was earning their salary.

At one such meeting, a prominent faculty member plugged a talk by a visiting British historian about the significance of the powder puff in 1920s Britain. The talk was built around an arrest of a suspected gay man who was carrying a powder puff, and the presenter riffed on industrialization, consumerism, commodification, and transgressive sexuality.

A few days later I asked a student who had been there what he thought about it. He had more traditional historical interests, but he said that 90 percent of the history courses at Williams were of that type.

Now in retirement, I have embarked upon a new project: a political history of the United States based upon the inaugural addresses and State of the Union addresses of our presidents. I have been reminded that from Washington forward, American political leadership and the people saw themselves as conducting a great experiment in free, representative government, which might set an example for the world.

One doesn’t have to view American history uncritically or ignore our frequent failures to live up to our ideals to regard this story as a fascinating and inspiring one. Yet that is the story that most university history courses today choose to ignore, in favor of meditations that reflect the personal interests of the faculty rather than the needs or interests of the students.

That is why history and the humanities have lost the central place they occupied in our universities a half-century ago, and why they will have so much trouble regaining it.

Why schools have stopped teaching American history

“Don’t know much about history . . .,” goes the famous song. It’s an apt motto for the Common Core’s elementary school curriculum.

And it’s becoming a serious problem.

A 2014 report by the National Assessment of Educational Progress showed that an abysmal 18 percent of American high school kids were proficient in US history. When colleges such as Stanford decline to require Western Civilization classes or high schools propose changing their curriculum so that history is taught only from 1877 onward (this happened in North Carolina), it’s merely a blip in our news cycle.

A 2012 story in Perspectives on History magazine by University of North Carolina professor Bruce VanSledright found that 88 percent of elementary school teachers considered teaching history a low priority.

The reasons are varied. VanSledright found that teachers didn’t focus on history because students aren’t tested on it at the state level. Why teach something you can’t test?

A teacher I spoke with in Brooklyn confirmed this. She said, “All the pressure in lower grades is in math and English Language Arts because of the state tests and the weight that they carry.”

She teaches fourth grade and says that age is the first time students are taught about explorers, American settlers, the American Revolution and so on. But why so late?

VanSledright also found that teachers just didn’t know enough history to teach it. He wrote there was some “holiday curriculum as history instruction,” but that was it.

Arthur, a father in Brooklyn whose kids are in first and second grade at what’s considered an excellent public school, says that’s the only kind of history lesson he’s seen. And even that’s been thin. His second-grade daughter knows George Washington was the first president but not why Abraham Lincoln is famous.

As the parent of a first-grader, I’ve also seen even the “holiday curriculum” in short supply. First grade might seem young, but it’s my daughter’s third year in the New York City public school system after pre-K and kindergarten. She goes to one of the finest public schools in the city, yet knows about George Washington exclusively from the soundtrack of the Broadway show “Hamilton.” She wouldn’t be able to tell you who discovered America.

So far, she has encountered no mention of any historical figure except for Martin Luther King Jr. This isn’t a knock on King, obviously. He’s a hero in our house. But he can’t be the sum total of historical figures our kids learn about in even early elementary school.

For one thing, how do we tell King’s story without telling the story of the Founding Fathers, the Constitution or of Abraham Lincoln? King’s protests were effective because they were grounded in the idea that America was supposed to be something specific, that the Constitution said so — and that we weren’t living up to those ideals.

The Brooklyn teacher I spoke with says instructors balk when it comes to history: They don’t want to offend anyone. “The more vocal and involved the parents are, the more likely the teacher will feel uncomfortable to teach certain things or say something that might create a problem.” Which leaves . . . Martin Luther King.

She cited issues around Thanksgiving, like teaching the story of pilgrims and the Native Americans breaking bread together as one that teachers might sideline for fear of parents complaining. Instead of addressing sticky subjects, we skip them altogether.

As colleges around the country see protests to remove Thomas Jefferson’s statues from their campuses, it’s becoming the norm to erase the parts of history that we find uncomfortable. It’s not difficult to teach children that the pilgrims or Thomas Jefferson were imperfect yet still responsible for so much that is good in America.

Jay Leno used to do a segment on his show called “JayWalking,” where he’d come up to people on the street and ask them what should’ve been easy historical questions. That their responses were funny and cringeworthy enough to get them on the show tells you how well it went.

Leno never asked the year the Magna Carta was published or when North Dakota became a state. He would ask what country we fought in the Revolutionary War, to name the current vice president or how many stars are on the American flag. And yet adults had no idea.

We talk often about how fractured our country has become. That our division increases while school kids are taught less and less about our shared history should come as no surprise.

My Own Brief Conclusion

There will always be debate and discourse over history, its methodology, what should be taught, how it should be taught, and who should teach it, among other things.

But when you watch an interview of college students (freshmen) who can't tell you who won the Civil War, you have to cringe.

From a postmodernist philosophical perspective, it is dangerous to teach history because events can be interpreted in an infinite number of ways; there are no "truths" or "facts." From my perspective, postmodernist philosophy is itself a dangerous outlook on the world, as it permits no canonical narrative. How do you orient yourself to the world when everything is relative and nothing is real?

The teaching of history must be re-invented, and it must be divorced from political viewpoints and ideology. While the teaching of dates and events can be a foundation, history must be taught and learned with a deep understanding of the context of events. Why did people do what they did? What led them to create the worlds they inhabited? How did they think? What was their worldview? How are these different from today?

Only then can we know how to go forward. 

Sunday, June 14, 2020

Should I Pay Extra on My Mortgage?

If you mean do I pay extra toward my principal, then yes. While many financial advisors — and many “armchair” internet “experts” — will tell you not to do this because you can get a better return in the market, that’s not always true. You have to figure that out for yourself.

Here are a couple of reasons to start paying down your mortgage:

  1. You don’t have enough equity (the difference between the home’s value and what you owe). If you don’t have at least 20%, pay it down.
  2. You’re getting closer to retirement, you’re debt free, and you’re saving at least 10 percent toward retirement. Use the extra funds to pay it down. If you hit retirement with a home free and clear, you’re going to be glad you did.
  3. Having no mortgage, especially if you’re already retired, gives you the option of using the equity in your home for retirement funds, such as a reverse mortgage. I wouldn’t recommend this for everyone, but it is an option. (Having a sufficient retirement account and income is the best plan.)

I built my retirement house the year I retired. This may seem weird to many, but during my get-out-of-debt phase, I had sold my house — and my rental property, which was not really generating much cash flow — and rented to lower my expenses. I chose to buy a home again to lock in my housing costs, with the added benefit of building just what I wanted. (I assumed that rents would continue to go up, while a mortgage is pretty predictable.)

For the last three years, I have used reason #1, because I only put 5% down on my VA loan (I could have put more down, but why tie up all the extra money right off the bat?). However, now that I have reached 20% equity (80% loan-to-value), I just reduced my extra principal payment so that I’m paying at least as much principal as interest. I don’t need to do this, but I thought I’d keep the principal moving downward faster than regular loan payments would. And I can adjust the extra principal downward each year. The extra funds can now go back to savings/investments.

The adage that you can earn more in the market than the cost of your mortgage is not necessarily true for everyone. There are many variables: 1) predicted market returns for your portfolio, 2) the current interest rate environment and 3) your mortgage interest rate.

While I chose reason #1, at least for the early years of the mortgage, this may not make sense for everyone. You have to sit down and do the work.
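For anyone who wants to do that work, the core of it is a simple amortization loop. Here is a minimal sketch with hypothetical loan numbers (not the actual terms of the VA loan discussed above), comparing a standard payment schedule against one with a fixed extra principal payment:

```python
# Minimal amortization sketch. Loan terms are hypothetical.

def months_to_payoff(balance, annual_rate, payment, extra=0.0):
    """Count months until payoff, adding `extra` dollars to principal each month.
    Returns (months, total interest paid)."""
    monthly_rate = annual_rate / 12
    months = 0
    total_interest = 0.0
    while balance > 0:
        interest = balance * monthly_rate
        total_interest += interest
        balance -= (payment - interest) + extra
        months += 1
    return months, total_interest

# $300,000 at 4%, with the standard 30-year payment of about $1,432/month.
base = months_to_payoff(300_000, 0.04, 1432.25)
fast = months_to_payoff(300_000, 0.04, 1432.25, extra=300)

print(f"No extra principal: {base[0]} months, ${base[1]:,.0f} interest")
print(f"$300/month extra:   {fast[0]} months, ${fast[1]:,.0f} interest")
```

With numbers like these, an extra $300 a month shaves several years off the loan and a large chunk of total interest; whether that beats investing the same $300 depends on the variables discussed below.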

Critical Financial Steps When Buying a Home

Refinancing your VA Loan

Thursday, June 11, 2020

An Emergency Fund is Not Optional

Nothing will derail your goal of becoming financially independent, financially solvent, debt-free, or on track for a great retirement faster than not having an emergency fund.

As the name suggests, an emergency fund is a stash of money set aside to cover living expenses in case of an emergency like a job loss, unexpected medical need or last-minute car repair. But how much money you should have in that fund depends on your income and your financial obligations, especially basics such as rent, utilities and food.

Your emergency fund should be used only as a last resort for real emergencies, once other strategies such as reducing your expenses have been exhausted. Making that commitment will mean you'll have access to this money when you truly need it, something you'll be grateful for.

Your emergency fund should be separate from your regular checking and savings accounts, and it should be filled with money only for emergencies. It's best to put your emergency money in an account that earns interest instead of stuffing it under your mattress. Do not use it to invest in the stock market.

If an emergency such as job loss unfortunately arises, you can pull money from the fund to make sure you can keep a roof over your head, stock up the refrigerator and make all your bill payments on time. Since an emergency fund doesn't need to be paid back with interest, it's usually the wiser choice over other potentially costly methods of borrowing money, such as withdrawing money from a retirement account, using a personal loan or relying on your credit cards.

Initially, if you don't have an emergency fund, set a goal of $1,000. This may seem small, but you have to start somewhere, and $1,000 seems more attainable than $20,000 or more. Once you've reached $1,000, you can add to it monthly until you've reached three to six months of living expenses. Even $1,000 can be a financial boon if the car needs a repair or the water heater goes out. If you have to use some of these funds, be sure to pay them back as soon as possible.

Everything financial begins with a budget

Starting an emergency fund begins with setting a budget. Calculate your monthly income and expenses: How much money is coming in and how much is going out?

A spreadsheet or a budget app can serve as the foundation for your plan, enabling you to carefully track your spending and stay on top of how much money you can carve out for your emergency fund.

Once you've done the math, look at how much money is left after taking care of basic living expenses, credit card bills, loan payments and so forth. Does the math indicate you're basically living paycheck to paycheck? If so, it's time to find ways to cut costs:
  • Tackle your credit card debt. When you reduce or eliminate that debt, you can free up money for the emergency fund. Paying down your credit cards can also help you raise your credit scores.
  • Decrease discretionary spending. This is spending on things other than the necessities, such as your gym membership or entertainment. Look at where you could stand to cut back here, and put this "found" money toward your emergency fund.
  • Find ways to reduce your bills. Making a call to your cellphone provider, insurance agent and other businesses you make payments to every month can sometimes result in savings if you simply ask. Some companies can provide special rates for customers who call to cancel an account, for instance.
When you have your budget set, and your savings plan outlined, the "set and forget" method can pay off. By setting up automatic transfers from your bank account to your emergency fund, you won't need to think about making those deposits. You might even look into a direct-deposit arrangement, allowing you to automatically steer a portion of your paycheck to your emergency fund.
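The budget arithmetic above is simple enough to sketch. The figures below are hypothetical, just to show how the monthly surplus, the three-to-six-month target, and the time to reach it all fall out of the same few numbers:

```python
# Hypothetical monthly figures; substitute your own from the budget step.

monthly_income = 4500.00
essential_expenses = 2800.00   # rent, utilities, food, insurance, minimum payments
discretionary = 900.00         # gym, entertainment, dining out

# What's left to steer toward the emergency fund each month.
surplus = monthly_income - essential_expenses - discretionary

# Target: three to six months of essential living expenses.
target_low = 3 * essential_expenses
target_high = 6 * essential_expenses

# Months of automatic transfers to reach the low end of the target.
months_to_low_target = target_low / surplus

print(f"Monthly surplus: ${surplus:,.2f}")
print(f"Target range:    ${target_low:,.2f} to ${target_high:,.2f}")
print(f"Months to reach the low target: {months_to_low_target:.1f}")
```

With these numbers, an $800 monthly transfer reaches the three-month target in under a year; cutting discretionary spending shortens that timeline directly.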

An emergency fund can be a lifeline when you confront unexpected circumstances, allowing you to have a place to live, put food on the table, pay utility bills and keep pace with credit card and loan payments.

Establishing an emergency fund can be one of the best gifts you can give your future self. And it can be simple to do, as long as you put together a budget, watch your spending and stick to your savings goal.

Tuesday, June 9, 2020

Using Stop Orders as Part of Your Investing Strategy

Update, Jun 11: As I noted below, a danger in stop orders is that if the price gaps below your indicated sell price, the order will be executed lower than you anticipated. This was the case today for SCHD, the example used below. The stop price was $54.45. However, the market gapped down at the open, and SCHD sold at $53.63 (see chart below). I still kept a profit of $840, slightly lower than I had planned. But if the market continues to sell off, I will have protected at least most of my profit in the position. And I have the option of buying back in if I think the market will reverse.

Have you ever sold a profitable position only to see it advance once again, leaving you in the dust? For example, you buy AMD at $19, then sell it at $32. That's a nice profit, and you'd be happy. But shortly after you sell, it takes off like a rocket, advancing to $55 over the next few weeks.

One method to manage this situation is the sell stop order. A sell stop order tells your broker to sell shares if the price falls to or below the price you set. Let's look at an example. 

AMD is currently at $33 a share. You don't know which way it will go, and you have a $14 per share profit. You could sell, but you could also place a sell stop order at $31 a share. If the price hits $31 or lower, your broker will sell your shares (or however many you specified; you don't have to sell your entire position). If the price goes up, which it did, nothing happens, and you can cancel your stop order and enter a new one at a higher price. Some brokers will also allow you to place a trailing stop, where you specify a dollar amount (say, $2 per share) or a percentage, and the stop follows the price upward automatically. 
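A minimal sketch of how a broker evaluates a plain sell stop versus a trailing stop. The price path and dollar amounts are hypothetical, and real brokers fill at the next available price, not necessarily the exact trigger:

```python
# Hypothetical price path for a stock bought at $19, now near $33.
prices = [33.00, 34.50, 36.00, 35.25, 33.90, 33.10]

# Plain sell stop: triggers the first time the price falls to or below it.
stop_price = 31.00
plain_triggered = any(p <= stop_price for p in prices)

# Trailing stop of $2: the stop ratchets up with the highest price seen,
# but never moves back down.
trail = 2.00
highest = prices[0]
trail_fill = None
for p in prices[1:]:
    highest = max(highest, p)
    if p <= highest - trail:
        trail_fill = p
        break

print(plain_triggered)   # False: the price never reached $31
print(trail_fill)        # 33.90: first price at or below 36.00 - 2.00
```

Note how the plain stop never fires on this path, while the trailing stop locks in much of the run-up to $36 by selling near $34.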

Let's look at a real-world example. I own 200 shares of SCHD, the Schwab U.S. Dividend Equity ETF, which holds Exxon, Home Depot, Intel, Bristol Myers, Texas Instruments and 92 other stocks.

Today, June 9, SCHD closed at $55.77. I originally purchased it for $49.77, so I have a modest profit of $1,200, which I'd like to protect. I could sell, but if the market continues up, I would lose out on further gains. So I put in a sell stop order with my broker at $54.45, when the price was $55.72, placing my stop about 2.3% below the current price. You can use any formula you'd like. A popular one is to place the stop 1 or 2 percent below the 50-day moving average, but I like to be a bit tighter. If I get stopped out, I can always buy in again. That's not the best way to manage positions, as it can mean higher taxes from short-term gains, but I'd rather pay a bit more in taxes than allow a winning position to turn into a losing one. 
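The arithmetic behind this stop placement, using the figures from the post:

```python
shares = 200
cost_basis = 49.77        # original purchase price per share
current_price = 55.72     # price when the stop was entered
stop_price = 54.45

# How far below the current price the stop sits, in percent.
pct_below = (current_price - stop_price) / current_price * 100

# Profit locked in if the stop fills exactly at the stop price
# (a gap down, as happened on Jun 11, fills lower than this).
protected_profit = shares * (stop_price - cost_basis)

print(f"Stop is {pct_below:.1f}% below the current price")          # 2.3%
print(f"Profit protected at the stop price: ${protected_profit:,.2f}")  # $936.00
```

The $936 figure is the best case; the Jun 11 update above shows how a gap down can deliver less.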

The illustration below shows my order screen. Note that I only had to change the order type from Market to Stop and enter the price of my stop. 

Note: There is a danger with stops. If the price gaps down, say to $54, your stop will be executed at the lower price, so be aware of this. An alternative is a stop-limit order. You can read a full explanation of the difference between stop and stop-limit orders at Investopedia.com, and I recommend you do. Educate yourself until you feel comfortable with these order types. 

Example of Stop Order

Happy Investing. 

Saturday, June 6, 2020

Factfulness: Ignorance about global trends. The world is actually getting better.

This newsletter was powered by Thinkr, a smart reading app for the busy-but-curious. For full access to hundreds of titles — including audio — go premium and download the app today.

From the layman to the elite, there is widespread ignorance about global trends.

Author and international health professor Hans Rosling calls Factfulness “his very last battle in [his] lifelong mission to fight devastating global ignorance.” After years of trying to convince the world that all development indicators point to vast improvements on a global scale, Rosling digs deeper to explore why people systematically have a negative view of where humanity is heading. He identifies a number of deeply human tendencies that predispose us to believe the worst. For every instinct that he names, he offers some rules of thumb for replacing this overdramatic worldview with a “factful” one.

In 2017, 20,000 people across fourteen countries were given a multiple-choice quiz to assess basic global literacy. Examples of questions were:
  1. Does the majority of the world population reside in low-, middle-, or high-income countries?
  2. Is the average life expectancy in the world 50, 60, or 70 years?
  3. How many of the world’s 1-year-olds have been vaccinated against some kind of disease? 20 percent? 50 percent? 80 percent?
Not a single person in the 20,000-strong sample got all of the questions right. The average score was 2 correct out of 13 questions. All questions were straightforward, free from vagary or trickery of any kind. All participants were educated professionals, including government officials, business professionals, doctors, journalists—even a Nobel prize winner.

Even at the World Economic Forum in 2015, in a room filled with a collection of high-ranking government and UN officials, professors, researchers, activists, and A-list reporters, most failed to correctly answer the three basic questions about global trends that were posed in the author’s lecture. This was a high-profile international affair, filled with people who have a vested interest in matters of global development.

A chimpanzee could do better than the average participant. That’s not a joke—it’s basic statistics that a chimp would likely get 4 or 5 of the 13 questions correct just by pressing random buttons. This means that, for some reason, we are systematically selecting the more pessimistic (and wrong) answers. What is shaping our intuition that the world is such a terrible place?
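The chimp baseline follows from simple expected value: with three answer options per question, random guessing gets each question right with probability 1/3, so the expected score on 13 questions is 13/3. A quick check:

```python
from fractions import Fraction

questions = 13
options = 3

# Expected number correct under uniform random guessing.
expected_correct = Fraction(questions, options)   # 13/3

print(float(expected_correct))   # about 4.33, versus the human average of 2
```

So the "4 or 5 correct" claim is just the expected value of random guessing, which humans underperform by systematically choosing the gloomier option.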

An outdated, overdramatic view of global affairs is hurting us.

Getting beaten by chimps in global literacy assessments is a bit embarrassing for humanity, but there’s hope! We are arriving at a better understanding of reasons for the underlying widespread ignorance. One factor is that our beliefs are outdated. Even after just a decade or two, the world can be a very different place. Consider one of the quiz questions about extreme poverty: In the past 20 years, has extreme poverty stayed the same, doubled, or been cut in half? The answer (which the vast majority got wrong) is that poverty has been halved in only two decades. This is arguably one of the most significant milestones of our time, and yet most people remain unaware of it.

Another factor is that we humans have a flair for the dramatic. Throughout our history, we have been drawn to epic stories of good overcoming evil and us-versus-them thinking. This impulse is deeply embedded within us, but it means that subtleties are missed because they just don’t have the staying power of a dramatic narrative.

Ignorance is not bliss. Our lives would be less stressful if we made a concerted effort to learn the facts and check our tendencies like focusing on the negative, playing the blame game, making false generalizations, blowing things out of proportion, and making everything ultra urgent. By doing so, our worldview will become less dramatic and more “factful.”

When we compare extremes, we forget that the majority of people are somewhere in the middle

Before despairing or decrying a so-called “gap” in society, look for the majority. Chances are that, between the poorest and the wealthiest, there is a sizable group who are comfortably middle-class.

Beware of averages. They can be misleading and convey the sense that every single poor person earns x amount of money while every single wealthy person earns x + y. Averages give no account of how widely the data are spread. Another example would be the pay gap between men and women. It is typically presented in terms of two figures, with men’s income figuring slightly higher than women’s. But if you look at the same data on a standard deviation graph, which shows not only averages but the highest and lowest earners as well, it becomes obvious that there is almost complete overlap between the bell curves displaying men’s and women’s earnings. The averages by themselves can give the impression that every single man earns more than every single woman, but one important truth that the standard deviation graphs help keep in perspective is that there are plenty of women who earn a lot and plenty of men who earn very little.
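The overlap point can be illustrated with two hypothetical earnings distributions whose averages differ but whose spreads largely overlap. Every number here is invented purely for illustration, not real wage data:

```python
import random
import statistics

random.seed(42)

# Invented earnings: same spread, slightly different means.
men = [random.gauss(52_000, 18_000) for _ in range(10_000)]
women = [random.gauss(48_000, 18_000) for _ in range(10_000)]

avg_gap = statistics.mean(men) - statistics.mean(women)

# Fraction of random woman/man pairings where the woman earns more.
women_higher = sum(w > m for w, m in zip(women, men)) / 10_000

print(f"Average gap: ${avg_gap:,.0f}")
print(f"Share of pairs where the woman out-earns the man: {women_higher:.0%}")
```

Despite an average gap of several thousand dollars, the woman out-earns the man in roughly four pairings out of ten, which is the "almost complete overlap" the averages conceal.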

Be careful when comparing extremes for the same reasons. By comparing the ultra-rich in Beverly Hills to the poorest in a famine-stricken region of East Africa, we lose sight of the fact that, on a global scale, the majority of the world lives somewhere in between. Examining only the top and bottom will give us a distorted picture of what the world is like.

The global consensus is that the world is getting worse—and they’ve got it all wrong

People in 30 countries were recently asked if they thought the world was getting better or worse. There was not one country in which the majority of those polled thought the world was getting better. A majority—and in some cases a vast majority—believe the world is getting worse.

But is it getting worse? About a century ago, there were no countries where a woman’s vote counted equal to a man’s. Now there are 193 countries that allow women to vote and consider their votes equal in weight. Two centuries ago, 2 percent of the world’s population lived in democracies. Now, 56 percent enjoy democracy. The incandescent light bulb was invented a bit over a century ago, and electricity was available only to the elite. Now 80 percent of the world has electricity. Enrollment of girls in primary schools has gone up from 65 percent in 1970 to 90 percent today. In under 40 years, access to water from a protected source has risen from 58 percent to 88 percent. In two centuries, global literacy rates have risen from 10 percent to 86 percent.

These are just a handful of the vast improvements that have taken place on a global scale. These changes are not insignificant.

No wonder we’re stressed and anxious—we believe the world is going to end up in the toilet. Who would have thought that statistics could be so therapeutic?

Before getting angry over an isolated statistic, find a standard of comparison

Around the world, 4.2 million infants died in 2017. The deaths of young children are certainly sad, and if you had a way of visiting each grieving family that will never see their child learn to walk, run, and laugh, you would have tears to cry for years. But tears alone will not bring about any change. What is more, this figure is actually extremely low. 4.2 million seems like a lot until we view the number in perspective. In 1950, it was 14.4 million—and that was when the world’s population was a third of today’s. By adding just one more figure, humanity’s trajectory goes from tragedy to cause for celebration.

We humans are in the habit of blowing things out of proportion. It comes pretty naturally to us. We also tend to assign great significance to singular events. The media knows these tendencies and can sometimes run away with them. If there is a story of a tragic death, the media knows we feel insensitive ignoring it. Or take the shocking statistics—made all the more shocking for being ripped from their contexts—that journalists put in their headlines. These stories are often spun in such a way as to make them seem more significant than they actually are.

It’s best not to rely on a single isolated statistic. Gather a cluster of related statistics to build a frame of reference. The power of an isolated statistic is that it can shock and outrage because it might sound high or low. Thus, we should view statistics with suspicion until we have a standard of comparison. Questions like how rates in other countries compare or what the figure was last year, a decade ago, or a century ago ground our understanding.

Another rule of thumb to avoid being unnecessarily flabbergasted is to focus on statistics that give rates rather than raw numbers. Per-capita rates give a far more accurate picture than sheer volume for an entire country or continent. Rates turn overwhelming figures into manageable chunks of information that we can wrap our minds around and readily compare.
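The infant-mortality comparison above, restated as rates. The death counts come from the text; the world population figures are approximate assumptions (roughly 2.5 billion in 1950 and 7.5 billion in 2017):

```python
# Raw counts from the text; world populations are rough assumptions.
deaths_1950, population_1950 = 14_400_000, 2_500_000_000
deaths_2017, population_2017 = 4_200_000, 7_500_000_000

rate_1950 = deaths_1950 / population_1950 * 1000   # deaths per 1,000 people
rate_2017 = deaths_2017 / population_2017 * 1000

print(f"1950: {rate_1950:.2f} per 1,000   2017: {rate_2017:.2f} per 1,000")
```

The raw count fell by about two thirds, but the per-capita rate fell roughly tenfold, which is the improvement the raw numbers understate.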

We can’t stop generalizing, but we can get better at it

We are categorizing, generalizing creatures. This tendency isn’t going anywhere, so it’s futile trying to eradicate it. What we should beware of isn’t generalizing, but doing a bad job of it. To keep a check on the generalizing tendency, you must be willing to question your categories.

Be cautious when people discuss “majorities.” When a headline refers to a “majority,” dig deeper. Both 52 percent and 98 percent are majorities, but that’s a huge difference. Beware of accounts that rely on vivid examples to make a case. These stories have staying power in the mind, but that doesn’t make them typical.

A final suggestion would be to learn from others. Operate under the assumption that people aren’t stupid. This fosters humility and curiosity about the world around you, and gives others room to expand your perspective. It is through collaboration and the sharpening of each other’s ideas that we arrive at the best solutions.

When we play the blame game, no one wins and nothing gets solved

We’re in the habit of pinning blame on power-hungry politicians and greedy corporate heads. The author describes an exchange with his class after announcing that he would be having a conversation with the CEO of a large pharmaceutical company. “Punch him in the face!” one student called out. The author indulged the hypothetical, asking what he should do after punching the CEO. Eventually the blame shifted from the CEO to the board members. But it was agreed that punching three or four board members before security was called in would not be effective either: even if the board members were removed, a set of like-minded men and women would be selected to take their place. So maybe it was the stockholders of this public company, those whose funds were perpetuating a business that looks after the interests of the wealthy instead of the poor. In the case of pharmaceutical companies, the typical investors are the elderly, because pharmaceutical stocks tend to be stable.

“So you may have to punch your grandmother,” the author told the student. He added that the extra money that grandmother gave for hiking trips last summer quite possibly came from company dividends. It was thus concluded that the student might have to punch himself in the face.

The point of this anecdote is that blame is something we’re eager to assign (to others, anyway), but it is often very difficult to isolate the guilty party. The blame game is easier to play than admitting complexity.

During the four decades that China’s one-child policy was in effect, the rate never dropped below 1.5 children per woman, whereas some places with no such policy—like Hong Kong, South Korea, and Ukraine—had rates that dropped below one child per woman. Or consider claims that the pope wields significant power over the world’s Catholic community. If that were true, why are contraceptives used more in Catholic-majority countries than in the rest of the world?

The point is that, while Mao had, and Francis has, strong political and moral influence, other factors are at play.

So before you stand on principle and punch grandma in the face (or a CEO or a board member or a journalist), ask yourself what you would gain by accusing that person or group. Scapegoating is cathartic, but it rarely results in positive change. Identify causes and systems rather than villains.

Actions taken from a position of urgency and panic usually cause more problems than they solve

The salesman and the activist both tell us that now is the time to act, that tomorrow might be too late. In doing so, they are exploiting another deeply embedded human tendency, one that requires conscious effort to curb: urgency. When there is an issue, we tend to believe that everything has to change immediately. This is an unrealistic, stressful way to approach problems, whether personal or global. It is symptomatic of an overdramatic world.

Extreme poverty, for example, is slowly but surely being eradicated, but the process has been a marathon—not a sprint.

To be sure, there are global concerns we should take seriously, like flu pandemics, financial implosion, climate change, and World War III. The point is not that nothing should be done, but that unintended negative consequences often follow in the wake of rash action. Lessening these pressing risks and making the world a safer, healthier, wealthier place will take global collaboration, pooled resources, the latest independent research, baby steps, meticulous evaluation of those steps, and deliberate, thoughtful action.


Friday, June 5, 2020

Employment up, market rallies: the takeaway

Employment stunningly rose by 2.5 million in May and the jobless rate declined to 13.3%, according to data Friday from the Labor Department that was far better than economists had been expecting and indicated that an economic turnaround could be close at hand.

Economists surveyed by Dow Jones had been expecting payrolls to drop by 8.33 million and the unemployment rate to rise to 19.5% from April’s 14.7%. If Wall Street expectations had been accurate, it would have been the worst figure since the Great Depression.

As of 9:00 AM June 5, the DJIA is up about 700 points, and the S&P 500 is up 60+, or 2.5%.

Stocks rally because investors focus on the future... And the latest airline data suggests a slow (but steady) return to normalcy. COVID-19 still has no tangible solution. Trade tensions with China are bubbling. Civil unrest is breaking out nationwide. But the Nasdaq 100 just hit a record high and the S&P 500 is down only 8% from its February peak — all because investors are focused on a more blue-skied financial future.

DJIA YTD. Blue line is 200 day MA.

More from Kelly Evans at CNBC

We just got a way better jobs report than expected for May. The U.S. added--added!--2.5 million jobs last month, versus the decline of 7.5 million that economists were expecting. The unemployment rate fell a point-and-a-half to 13.3%, compared with estimates calling for it to rise to 19% and some forecasts that ranged as high as 26%.

It's a completely unexpected outcome--except that it's not. The stock market has been telling us that the U.S. was rebounding more quickly from the pandemic than anyone expected. But instead of being cheered by the market rebound, people decried "the gap between Wall Street and Main Street," and focused instead on the gloomy forecasts about how much of this job loss researchers think will be permanent and how we're already in another Great Depression.

Thank God we have the stock market. At least it's one source of information that doesn't hail from the group-think halls of academia and think tanks. And yet, surprise, surprise, it's more often than not the target of the professional class's ire. Every campaign cycle, someone is pushing a financial transaction tax to supposedly protect the interests of everybody else against the 1% (and to pay for all the programs their experts will run). I'm starting to wonder if it's not meant to protect themselves against the inconvenient verdicts of the stock market.

After all, guess who has been buying this market since the darkest days of the pandemic? Retail investors. The general public. Here's Axios, on March 30: "As traders around the globe have frantically unloaded positions in recent weeks, so-called mom and pop retail investors have kept level heads and not sold out of stocks." See? "Wall Street" was panicking and selling the rally. Everyday Americans were buying.

And while "retail buying" is typically treated with contempt by the professionals, it turns out the Average Joe was on to something. Here's Forbes, a month ago: "Buyer Beware: Retail Investors Buying USO's Oil ETF." USO closed under $20 that day. Today, it's over $28. Or how about all the retail investors buying airline stocks in April? American has gone from the $9-12 range that month to over $21 today.

We still don't know what the future holds. (Obviously.) But we're showing much more resilience in the early stages of recovery and reopening than was expected, and it was the everyday Americans who believed that during the darkest days of the shutdown. And it was the stock market propelled by Main Street buyers, as opposed to the "experts," telling us that all along the way.

Thursday, June 4, 2020

CARES Act provides protections to VA loan holders

On March 27, 2020, the President signed into law the Coronavirus Aid, Relief, and Economic Security Act, Public Law 116-136. The CARES Act protects borrowers with Federally-backed mortgage loans who are experiencing financial hardship due to the COVID-19 national emergency.

The CARES Act provides multiple protections on your VA-guaranteed loan if you experience financial hardship directly or indirectly caused by the COVID-19 emergency, regardless of your loan’s default status. These protections include:
  • A defined forbearance period of up to 180 days, with the possibility for extending it for another 180 days
  • A foreclosure and eviction moratorium for 60 days starting March 18, 2020
  • Instructions on how mortgage servicers are to report to the credit agencies. For example, borrowers who have requested the COVID-19 Forbearance option are not considered to be delinquent for purposes of credit reporting.
A forbearance is a defined time period of one month or longer during which your mortgage servicer agrees to accept reduced payments or no payments. During a forbearance under the CARES Act, your mortgage will continue to accumulate interest, but not late fees or other penalties.

The payments will still be due on your loan, just not during the forbearance period. A forbearance allows you time to resolve the reason that you can’t pay the regular monthly installment and get back on a regular monthly repayment schedule again.
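A rough sketch of how interest keeps accruing during a forbearance. The loan balance, rate, and forbearance length below are hypothetical, and real servicers may compound or schedule repayment differently:

```python
# Hypothetical VA loan: interest continues to accrue during forbearance,
# but no late fees or other penalties are added (per the CARES Act).
balance = 250_000.00
annual_rate = 0.035
months_of_forbearance = 6

monthly_interest = balance * annual_rate / 12
accrued = monthly_interest * months_of_forbearance

print(f"Interest accrued per month: ${monthly_interest:,.2f}")   # $729.17
print(f"Accrued over {months_of_forbearance} months: ${accrued:,.2f}")  # $4,375.00
```

That accrued amount, plus the skipped payments, is what must eventually be resolved when the forbearance ends, which is why a forbearance is a pause, not forgiveness.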

Forbearance in the CARES Act is broken into two pieces: an initial period and an additional period. For the initial period, you may notify your mortgage servicer that you are financially affected by the COVID-19 emergency and request up to 180 days of forbearance. You don’t have to use the entire forbearance period if you can resume payments sooner.

For the additional period, you may notify your mortgage servicer that you are still financially affected by the COVID-19 emergency and request up to 180 additional days of forbearance. As with the initial period of forbearance, you don’t have to use the entire period of forbearance if you can resume payments sooner.

You simply need to contact your mortgage servicer and request a forbearance because of financial difficulties due to the COVID-19 national emergency.

For more information, visit the VA benefits page here

Wednesday, June 3, 2020

5G. Was it worth it?

In January 2019 I proposed the following basket of stocks related to 5G as a possible portfolio. 

5G Network Portfolio

The initial total investment was $75,000. The value is currently $102,526, a gain of $27,526. Including dividends, the profit is $30,037, for a total return of 40 percent over roughly 17 months (about 27 percent annualized). 
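The return figures can be checked in a few lines. The 17-month holding period (January 2019 to June 2020) is inferred from the post dates, and the annualization uses simple compounding:

```python
initial = 75_000.00
total_profit = 30_037.00      # price gain plus dividends, per the post

total_return = total_profit / initial

months_held = 17              # January 2019 to June 2020
annualized = (1 + total_return) ** (12 / months_held) - 1

print(f"Total return: {total_return:.1%}")   # 40.0%
print(f"Annualized:   {annualized:.1%}")     # 26.8%
```

Geometric annualization gives a figure a bit under the simple "40 percent over 17 months" pace would suggest, since compounding spreads the gain over time.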

While some of these stocks are near or at 3-year highs, it might be worth investigating a smaller basket: ADI, DLR, GLW, QCOM, T, VIAV, XLNX. These companies also pay dividends, so a dividend play alone would yield about 2.9% annually. 

Do your due diligence. Happy investing.
