Rating Hospitals and Law School Faculties
Today's NY Times has a story detailing NYC's plan to make publicly available the infection and death rates at city hospitals. As decried in FindLaw columns by DoL blogger Sherry Colb (here and here), infections acquired in hospitals account for about 100,000 deaths a year, a truly staggering figure. Some hospitals have begun to take measures to address the problem, which is mostly a function of inadequate or absent handwashing by medical staff. In his book Better, Atul Gawande explains that proper hygiene technique is observed in the OR but not in the rest of the hospital, mostly because doctors and nurses pressed for time find it too cumbersome. He describes successful measures to change medical staff behavior.
The NYC effort should complement one of Colb's proposals: By alerting the public to the danger, it will give hospitals incentives to improve hygiene. In this as in many other contexts (e.g., the Toxic Release Inventory, aspects of the more successful state public education reforms on which No Child Left Behind was based), requiring public disclosure of information can be a cheap yet powerful regulatory technique. Unfortunately, the NYC program doesn't (yet?) apply to non-state hospitals, whether private or not-for-profit. Statewide or even federal legislation (valid under the Commerce Clause) imposing a reporting mandate would be extremely valuable here. To be sure, there is a risk of adverse selection and cheating in reporting (e.g., hospitals reporting staph infection deaths as something else), but there are ways of policing those behaviors. A single number---deaths due to hospital-induced infections---would make the data very useful.
Meanwhile, Brian Leiter has just released his latest effort to measure the top US law schools by scholarly impact. As Leiter himself notes, it's a bit odd to measure scholarly impact by number of citations. Among other problems he identifies, some works are cited because they make "classic mistakes." Although my home institution does reasonably well by Leiter's measure (5th in overall mean citations; 6th in median citations), the whole enterprise has the air of a popularity contest. True, it ranks institutions in a way that roughly correlates with my own general sense of their scholarly influence---but that's not really telling us anything, is it? We certainly would expect scholars at top schools to be cited a lot. What would be more useful to know---though just about impossible to quantify---is which scholars are writing terrific work that isn't getting noticed. The problem, I guess, is that "quality of legal scholarship" is a much more elusive concept than "deaths due to hospital-induced infection."