
Higher Ed Watch

A Blog from New America's Higher Education Initiative


Congress Reaches Pell Grant Funding Agreement for Fiscal Year 2012

December 13, 2011

Media reports indicate that the House and Senate have reached an agreement on fiscal year 2012 funding for the U.S. Department of Education as part of an omnibus spending bill that covers multiple federal agencies. Many education supporters have been waiting to see how Congress would fund the Pell Grant program for fiscal year 2012 (which will support grants in the 2012-13 academic year), given that the House and the Senate had previously proposed very different plans for the program. Although both chambers proposed maintaining the current maximum grant of $5,550, a House draft would have made nearly a dozen changes to eligibility rules that would have reduced the cost of the program, while the Senate proposed redirecting money spent on student loan subsidies to Pell Grants.

Click here to go to Ed Money Watch to read the rest of this blogpost...

The 2011 Academic Bowl Championship Series

December 7, 2011
The final college football Bowl Championship Series rankings were announced on Sunday: Alabama and Louisiana State University (LSU) will go head-to-head in this year’s National Championship game come January.

No doubt, watching Alabama try to beat the undefeated LSU team for the second time this season in an SEC vs. SEC match-up has the potential to be great football. The Bowl Championship Series, college football’s ranking system that matches up top Division I teams for a series of annual post-season bowl games, is perennially disliked for the opaque formula it uses to determine which teams are best but loved for the many high-intensity games it has matched up over the years.

Still, we shouldn’t turn a blind eye to the sport’s dark side. In the classroom, Division I college football teams often fall short. The fact that a player’s college football career is valued more than his academic career is often accepted as the status quo in Division I college football. But these players put in a lot of work for their teams and are compensated solely with college scholarships—and how much is that scholarship worth if a student athlete doesn’t graduate? Though some will go on to have lucrative pro football careers, most won’t, and the players who aren’t bound for NFL glory would benefit from having college degrees to fall back on.

With these issues in mind, policy researchers at the New America Foundation’s Higher Ed Watch blog have for several years used a formula to rival the Bowl Championship Series’s rankings. The Academic BCS measures how well a team supports the “student” side of its student athletes.

Unlike the BCS’s controversial ranking formula, the Academic BCS transparently compares data on team graduation rates and academic progress rates (an NCAA measure of academic success) to the performance of other teams, as well as to regular students at BCS colleges. The results are a look at how football schools would stack up if academics decided a team’s BCS ranking.

The 2011 Academic Bowl Championship Series Rankings

If academics were central to the Bowl Championship Series, top-ranked LSU would fall to 13th in the rankings. The biggest factor dragging down LSU’s performance is the fact that black players at LSU are a whopping 32 percentage points less likely than their white teammates to graduate from college within six years.

The first-place academic team in the BCS rankings is Penn State. Eighty percent of Penn State football players who enroll as freshmen currently graduate from college in six years or fewer, a respectable graduation rate for any sports team, or even for a university at large. (If you exclude students who transfer or leave the college to play professionally, 87 percent of Penn State players graduate in six years or fewer.)

Additionally, there is no black-white graduation rate gap among players on the university’s football team. Disappointingly, that is very rare in Division I football. Stanford, for example, does an extremely good job of graduating its football players overall and was ranked #1 in last year’s Academic BCS, but the school fell to fourth this year largely because of a 21-point gap between the black and white graduation rates for its players.

In the Bowl Championship Series rankings, Alabama barely edged out Oklahoma State for the second spot this year, stirring controversy over whether the rankings were fair. Unfortunately for Oklahoma State, Alabama triumphs in both the BCS and its academic counterpart: Alabama ranked fifth in our academic ranking and second in the regular BCS ranking, making Alabama and Stanford the only two teams to place in the top 5 for both football and academics. Oklahoma State fared worse, coming in third in the BCS and 15th in the academic rankings.


Four different calculations are used in the Academic BCS: the football team’s graduation rate relative to the school overall; the difference between black and white graduation rates on the team; the difference between black and white graduation rates at the school overall; and the difference between the graduation rates of black players on the football team and the school’s overall black student population.

We use the standard 4-class average graduation rate in our BCS rankings. This rate does not take into account students who transfer out of a college or leave to play professionally. Using this rate allows us to best compare a football team’s graduation rate to the overall graduation rate of students at that college.

Also important is the pool that our BCS ranking draws from. We use the top 25 teams in the Bowl Championship Series’s final standings when we make our rankings. Thus, the teams in the Academic BCS have displayed prominence on the field and, in the case of the top-ranked teams, in the classroom. A list of this year’s BCS rankings and the rankings from previous years is available here. A more detailed explanation of our formula is available here.

The NCAA has its own measure of academic success, the academic progress rate, which measures whether a team is moving its players toward graduation. With the APR, teams get points for eligibility (having players whose grades are good enough to play sports) and for retention (having players who don’t drop out of college). Higher Ed Watch takes the APR into account in our formula, but assigns less weight to it than to other measures that we believe are more important.
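As a rough illustration, the four graduation-rate comparisons plus a lightly weighted APR could be combined into a single score along the following lines. To be clear, this is a sketch, not the actual Academic BCS formula: the real weights and normalization are not spelled out here, so every weight, field name, and the APR rescaling below are assumptions for illustration only.

```python
# Illustrative sketch only: the real Academic BCS formula is not published in
# full here, so the weights and normalization below are assumptions.

def academic_bcs_score(team, school, weights=None):
    """Combine the four graduation-rate comparisons described above, plus the
    NCAA APR, into a single score. Higher is better. Rates are percentages."""
    if weights is None:
        # Hypothetical weights; the post says only that the APR counts for
        # less than the graduation-rate measures.
        weights = {"team_vs_school": 0.3, "team_gap": 0.25,
                   "school_gap": 0.15, "black_players_vs_school": 0.2,
                   "apr": 0.1}
    components = {
        # Team graduation rate relative to the school overall.
        "team_vs_school": team["grad_rate"] - school["grad_rate"],
        # Black-white graduation gap on the team (a gap lowers the score).
        "team_gap": -(team["white_grad_rate"] - team["black_grad_rate"]),
        # Black-white graduation gap at the school overall.
        "school_gap": -(school["white_grad_rate"] - school["black_grad_rate"]),
        # Black players vs. the school's black students overall.
        "black_players_vs_school": team["black_grad_rate"] - school["black_grad_rate"],
        # NCAA academic progress rate, rescaled from its 0-1000 scale
        # around the NCAA's 925 benchmark (an assumed normalization).
        "apr": (team["apr"] - 925) / 10.0,
    }
    return sum(weights[k] * components[k] for k in components)

# Example with made-up numbers loosely echoing the Penn State discussion:
penn_team = {"grad_rate": 80, "white_grad_rate": 80,
             "black_grad_rate": 80, "apr": 970}
penn_school = {"grad_rate": 85, "white_grad_rate": 86, "black_grad_rate": 75}
print(round(academic_bcs_score(penn_team, penn_school), 2))  # prints -1.7
```

The sign conventions capture the post's logic: a team can beat its school's overall rate, but any black-white gap, on the team or at the school, drags the score down, which is why LSU's 32-point gap is so costly in this kind of ranking.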

This year's Academic BCS rankings were published yesterday by TIME.com.

A Rare Bit of Good News for Pell Grants: A Surplus

December 6, 2011

Here’s some good news for Pell Grants. Budget analysts expect the program to run a small surplus in fiscal year 2012. It turns out that Congress overfunded the program ever so slightly over the past few years, and as lawmakers look to finalize fiscal year 2012 funding in the coming days, they are likely to overshoot just a bit on Pell Grant funding. This is particularly good news because for the past few years Congress has done just the opposite—lawmakers have knowingly underfunded the program, throwing fuel on the Pell Grant funding fire.

Because Pell Grants operate like an entitlement, but one for which Congress has to actually appropriate funding (real entitlements don’t need annual appropriations; they are automatically funded), Congress has to estimate how much the program will cost at least a year in advance. That estimate changes throughout the year as new data on student enrollment, income, etc. become available.

Click here to continue reading this post on Ed Money Watch...

What Happens to Higher Education Funding When the Supercommittee Fails?

November 21, 2011

The Joint Select Committee on Deficit Reduction (the supercommittee), which is charged with finding ways to cut the budget deficit over 10 years by at least $1.2 trillion, looks set to miss its deadline. The Budget Control Act of 2011, the law that increased the debt ceiling and created the supercommittee, set November 23rd as the date by which the committee must vote on a deficit cutting bill. With two days to go, no such vote is expected to happen. What might become of federal education programs in the wake of a supercommittee failure?

Today the Washington Post highlighted key K-12 education programs that would see their fiscal year 2013 funding trimmed in the absence of a supercommittee compromise. But the article doesn’t mention federal programs for higher education. Make no mistake, higher education programs, like Pell Grants and student loans, would also be affected by a supercommittee failure.

To read this complete post on Ed Money Watch, click here.

A Fond Farewell

November 18, 2011

Dear Readers,

I regret to report that this is my final post for Higher Ed Watch. I am leaving the New America Foundation to join the higher education team at Education Sector, where, among other things, I will be writing for The Quick and the Ed blog. While I am thrilled to take advantage of this exciting opportunity, it is difficult to leave Higher Ed Watch behind because of all the important work we have done.

When I arrived at New America nearly five years ago, Higher Ed Watch was just getting off the ground -- although it had already started making waves with its hard-hitting coverage of the student loan industry. But the blog really took off in April 2007 when we were the first to uncover and reveal improper payoffs that a student loan company had made to college financial aid directors and to a top U.S. Department of Education official (who was later indicted in this case). The news helped shine a spotlight on the fatal flaws -- and outright corruption -- that characterized the Federal Family Education Loan program.

Over the years, our reporting, analysis, and commentary has helped spur policymakers to overhaul the federal student loan programs, take positive steps to protect the most vulnerable students from predatory lenders and schools, and investigate widespread abuses in the for-profit higher education sector. We have also advocated policies aimed at making college more accessible and affordable without overly burdening taxpayers.

Before signing off, I would like to thank current and former New America employees who have played a pivotal role over the years in the blog’s success, including its founder Michael Dannenberg, Jason Delisle, MaryEllen McGuire, Ben Miller, and Lindsey Luebchow. I have been honored to work with them all.

But mostly I want to thank you, our loyal readers, for sticking with us all these years. It’s been a privilege to write for you.

Exclusive: How the Widening Job Placement Rate Scandal Could Have Been Avoided

November 15, 2011

[Over the last several months, Higher Ed Watch has examined how many for-profit colleges cook the books on the job placement rates they disclose to prospective students and regulators. In prior posts, we have looked at how the manipulation of these rates is a widespread problem throughout the industry; revealed some of the most common tricks of the trade for-profit schools have used to inflate these numbers; showed how accreditors and regulators have been asleep at the switch as these abuses have been occurring; reported on the Obama administration's unsuccessful effort to curb these practices; and examined how the drive to manipulate these rates comes straight from corporate headquarters and not rogue employees (see here and here). Today, we finish up this series by going back in history to see how this all could have been avoided.]

The origins of the widening job placement rate scandal in the for-profit higher education sector go back nearly 20 years. Had the Clinton administration officials who ran the Department of Education at the time heeded the warnings of Congressional investigators, the Government Accountability Office, and the Department’s own Inspector General, the abuses that are being unearthed today could have been rooted out long ago.

The story begins in the early 1990s when a Senate oversight committee headed by Sam Nunn (D-GA) conducted an investigation that uncovered widespread fraud and abuse in the for-profit higher education sector. The Nunn Committee revealed that scores of unscrupulous schools were reaping profits from the federal student aid programs by enrolling people straight off the welfare lines and pressuring them to sign up for student loans they had little hope of ever repaying. Many of these individuals were lured into the schools with false promises about the lucrative jobs they would be able to get after attending these institutions.

When it came to assigning blame for the federal government’s failure to stop these schools from ripping off students and taxpayers, the committee found that there was plenty to go around. However, the panel reserved some of its harshest criticism for the accrediting agencies that had failed to weed out these institutions or even to detect that anything was amiss.

In its final report in May 1991, the committee urged the Education Department to work with the accreditors to strengthen their ability to carry out their oversight responsibilities or strip them of their gatekeeping role entirely. As part of this effort, the committee recommended that the Education Department be required to “develop minimum uniform quality assurance standards” that accreditors would use to evaluate for-profit schools -- including establishing a single methodology for calculating job placement rates. According to the report:

The Department should be responsible not only for formulating those standards, but also for developing and carrying out a meaningful review and verification process designed to enforce compliance with those standards. If the Secretary determines that an accrediting body does not or cannot meet these requirements, recognition should be terminated.

In 1992, as part of legislation reauthorizing the Higher Education Act (HEA), Congress followed up on this recommendation by requiring the Education Department to put in place standards it expected accreditors to meet as part of its evaluation process. Lawmakers also required that the accrediting bodies have standards in place for judging a school’s “success with respect to student achievement in relation to its mission, including, as appropriate, consideration of course completion, State licensing examination, and job placement rates.”

When the time came for the Clinton administration officials in charge of the Education Department to write the rules for carrying out the Higher Education Act revisions, they took a very narrow reading of these requirements. Noting that Congress had not explicitly mandated the establishment of uniform standards, they gave accreditors wide latitude to develop their own criteria for judging a school’s “success with respect to student achievement” and for verifying the information that schools would provide them. Department officials explained in the preamble to the regulations that they had to stick “closely to the law” to avoid “regulation driven management.”

Both the Government Accountability Office and the Education Department’s own Inspector General Thomas Bloom objected to the final rules. In blistering testimony Bloom delivered to a House Government Oversight subcommittee in 1996, he accused the Department’s leaders of misinterpreting the intent of Congress:

By requiring the Department to ‘set standards’ for evaluating accrediting agencies in specified areas, Congress was directing the Department to put meat on the bare-bones statutory language in order to ensure that the agencies had meaningful, quantifiable, and enforceable standards for their member schools…the Department’s regulations are not what the 1992 HEA amendments contemplate.

The failure of the Department to set standards and require vigorous enforcement would only lead to more fraud and abuse, Bloom argued:

Without enforceable standards, schools that fall short of their own accrediting agency standards -- even in such basic areas as graduation and job placement -- may continue to be accredited and continue to participate in the SFA [student financial aid] programs. Since what you measure you get, without measurement and enforcement of even these basic standards for student achievement, we cannot assure that vocational trade schools in the SFA program will consistently graduate and place the bulk of their students in jobs for which they were trained.

Bloom’s testimony was prescient. As we’ve previously written, the job placement rates that for-profit colleges are required to disclose to prospective students and report to accreditors are fundamentally flawed. The methodology that career colleges use to calculate the rates varies from accreditor to accreditor, making the rates impossible to compare. And because accreditors, state regulators, and the federal government alike make little effort to verify these rates, schools have found them easy to game (see here for some of the most common tricks of the trade).

At Higher Ed Watch, we believe that the Nunn Committee and the Inspector General were right and that federal officials should develop a single, national standard that for-profit colleges would be required to use when calculating their job placement rates. It would be accompanied by a strict regulatory regime that would more closely monitor schools to ensure that these numbers are not rigged.

The Obama administration tried to move forward with such an effort but bungled it. However, as evidence of widespread abuses mounts, we believe that policymakers won’t have any choice but to revisit this issue.

It’s just a shame that all the damage that has been done to unsuspecting students and taxpayers could have been avoided.

When it Comes to Job Placement Rates, It’s All About the Numbers

November 10, 2011

[Today at Higher Ed Watch, we are running the second of two posts looking at the question of who's to blame for job placement rate abuses at for profit colleges. Click here to read the first post, which ran on Wednesday.]

Kathleen Bittel was ready for a change.

For 16 months, Bittel worked as an admissions counselor for Education Management Corporation’s Argosy University. Under constant pressure to meet her enrollment targets, she felt she was doing more harm than good to the lives of the students she admitted. So when the opportunity to transfer to EDMC’s Career Services Department arose, she jumped at the chance.

Becoming a career service adviser in the online division of EDMC’s Art Institute brand meant taking a hefty pay cut. But Bittel was willing to do it because she saw the new job as “an act of penance” for the work she had previously done as a recruiter. She believed that in her new post she would finally be able to help students achieve their dreams. Her excitement, however, was short-lived.

“At first, I found it very rewarding to have the opportunity to get to know and work with the industrious graduates of the Art Institutes who were actively seeking a better life. I felt I could provide valuable assistance in helping students find good jobs in a poor job market,” Bittel said in testimony she delivered at a Senate Health, Education, Labor and Pensions Committee hearing a little more than a year ago. “But that feeling did not last long. I realized it was all about hitting quotas instead of really helping students find meaningful work.”

In a separate letter to the Senate Committee’s chairman Tom Harkin and other panel members, she wrote, “What I found in Career Services was even more deceptive than the recruiting practices.”

Where the Real Fault Lies

As we’ve previously written at Higher Ed Watch, there is growing evidence that a substantial number of for-profit college companies, both big and small, have deliberately misled prospective students and regulators about their record in placing graduates into jobs.  When abuses have been unearthed, the companies invariably blame them on “rogue” employees. But the truth is that at many of these companies’ schools, the drive to inflate job placement rates comes straight from corporate headquarters.

At these institutions, career service advisers are constantly in a frenzy to meet the aggressive placement targets set by the companies’ leaders. They are regularly reminded that their continued employment depends on meeting their quotas. Just as in recruiting, employees who bend the rules to achieve their numbers are richly rewarded. Those who miss their targets have their pay docked and know that their jobs are at risk.

So is it any wonder that these advisers count any and every job their graduates get as a successful placement, even if the former students are flipping burgers at McDonald’s or serving coffee at Starbucks because their training didn’t pay off? Or that they include graduates in their rates who worked for as little as a day? Or that some even falsify or fabricate employment records altogether? (See here for some of the most common tricks of the trade.)

Kathleen Bittel is just one former career services adviser. But her experiences show the unrelenting amount of pressure that is placed on employees to pump up these numbers by any means necessary.

Learning the Tricks of the Trade

Soon after Bittel started her new job, a co-worker took her aside to show her a trick of the trade: how to alter graduates’ employment records to ensure that they count as being placed. As an example, the adviser pulled out a document that a recent graduate had submitted showing earnings that were too low to be considered a “successful placement.” The co-worker then tossed the document into the trash and created a new one using average salary data from the website Salary.com for the relevant position.

Bittel says she was outraged and reported the incident to her supervisors. But instead of being disciplined, the staffer received an award shortly thereafter for being a star performer. The message this sent was “abundantly clear,” Bittel told the Senate committee. “Employees who hit their numbers will be rewarded regardless of whether graduates actually succeed, or whether the information entered truly represents the graduates’ circumstance.”

Bittel wanted to perform her job honestly but found it nearly impossible to do so. The company expected career advisers to help place about 86 percent of the online art students they were assigned into jobs related to their field of study within six months of graduating. Because there were only five advisers in her division, Bittel had to work with between 150 and 180 students and graduates during each reporting period. Employees who met their targets received a $3,000 bonus each quarter.

Despite her best efforts, Bittel had trouble meeting her quota, and the pressure placed on her only continued to grow. “I was constantly reminded that my numbers were not as high as they wanted them to be.” One quarter, she fell short by just one tenth of one percent. As a result, “the company docked $500 from my bonus and I was told that I could lose my job if I failed to meet October’s goal.”

The situation “culminated,” she said, when she was abruptly called one day into a meeting with her bosses:

The head of the department interrogated me, asking the same questions over and over. ‘Why were my numbers the lowest on the team, and why did I think that everyone else had the numbers he wanted and not me?’ He demanded that I provide him with a plan on how I intended to meet his number, reminding me that my job was in jeopardy should I fail.

She soon reached “the breaking point of my conscience due to the constant pressure to do things I felt to be morally unethical,” and took a leave of absence from which she never returned.

Pumping Up the Numbers

But why would for-profit college companies go to so much trouble to inflate their job placement rates?

For one thing, state regulators and national accreditation agencies generally require for-profit colleges to place between 60 and 70 percent of their students in jobs in the fields in which they trained to remain eligible to participate in the federal student aid programs.  A failure to meet these thresholds could, in other words, be a death sentence for the schools. As a result, these institutions have a major incentive to do whatever it takes to keep these rates high.

For another, high placement rates are an essential recruiting tool for bringing in students. Most students are attracted to these institutions on the promise that they are going to get a good job that pays well.

But for some for-profit college executives, there is an even greater incentive for inflating these rates -- their bonuses depend on it. This was the case at EDMC, at least in 2006, according to a Securities and Exchange Commission (SEC) filing from that year that Higher Ed Watch has obtained. That document shows that “the short term cash award bonuses” that EDMC’s leaders received that year depended on the company achieving a specific placement rate and average salary amount for its graduates. “Performance above or below target is increased or reduced by 4 percentage points for each 1 percent difference between plan and an actual performance,” the filing states.
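The arithmetic of that filing's rule is simple: the bonus payout swings 4 points for every 1 percent that actual performance deviates from plan. A minimal sketch, with the function name, the 100-percent baseline payout, and the example rates all assumed for illustration (the filing specifies only the 4-to-1 ratio):

```python
# Sketch of the bonus-adjustment rule quoted from the 2006 SEC filing:
# the payout moves 4 percentage points for each 1 percent that actual
# performance deviates from plan. The 100% baseline and example numbers
# are assumptions, not figures from the filing.

def bonus_payout_pct(planned_rate, actual_rate, base_payout=100.0):
    """Return the bonus payout as a percent of the target award."""
    diff = actual_rate - planned_rate   # percent above or below plan
    return base_payout + 4.0 * diff     # 4 points per 1 percent of deviation

# A placement rate 2 percent above plan raises the payout by 8 points:
print(bonus_payout_pct(planned_rate=70.0, actual_rate=72.0))  # prints 108.0
```

The 4-to-1 leverage is the point: a small movement in the reported placement rate translates into a four-times-larger swing in an executive's payout, which is exactly the kind of amplified stake that rewards pumping up the numbers.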

There is, of course, an argument that basing bonuses on placement rates is beneficial because it gives company executives a buy-in to the success of their graduates. But it also gives these officials a major financial incentive to do anything they can to pump up these numbers.

Given Bittel’s experiences, is there any doubt about which of these outcomes really motivates them?

Next week, we will complete our job placement rate series. Stay tuned.

Who’s to Blame for Job Placement Rate Abuses at For-Profit Colleges?

November 9, 2011

[Over the last several months, Higher Ed Watch has examined how many for-profit colleges cook the books on the job placement rates they disclose to prospective students and regulators. In prior posts, we have looked at how the manipulation of these rates is a widespread problem throughout the industry; revealed some of the most common tricks of the trade for-profit schools have used to inflate these numbers; showed how accreditors and regulators have been asleep at the switch as these abuses have been occurring; and reported on the Obama administration's unsuccessful effort to curb these practices. Today, in the first of two posts, we will examine one of the most common excuses for-profit schools make when they are caught inflating these rates.]

When Career Education Corporation executives first revealed in August that they had found that “certain” of the company’s health professional schools had engaged in “improper practices” in calculating their job placement rates, they did what for-profit college leaders almost always do when improprieties are discovered on their campuses: they blamed “rogue” employees.

“The actions of a few people have let down others who work hard and responsibly every day on behalf of our students,” Gary McCullough, Career Education’s chief executive officer, said at the time.

But an internal probe of the company performed by an outside law firm has put the lie to these claims. As Career Education officials revealed last week, the lawyers found that the vast majority of the company’s Health Education and Art & Design schools significantly inflated the 2010-11 job placement rates they have been disclosing to prospective students and were about to report to the Accrediting Council for Independent Colleges and Schools (ACICS). “We uncovered that what were going to be reported as placements in a number of cases and a number of places were not genuine placements,” Steve Lesnik, the company’s chairman, said during a conference call with financial analysts. Lesnik took over as the company’s CEO last week after McCullough abruptly resigned from his post.
