Consortium News

  • 26 May 2015 4:53 PM | Deleted user

    retrieved from  |  May 26, 2015

    With major data breaches making headline news on a near-weekly basis, the healthcare industry is increasingly focused on cybersecurity, as well it should be.

    Beyond working toward a culture of security in which all employees are trained to spot and prevent attempted cyber-attacks, a strong cybersecurity program is an essential part of any long-term business strategy, and healthcare entities are no exception.

    But where to begin? And how best to move forward?

    These 10 tips can help healthcare organizations establish such a program:

    1. Create a strong, cross-sectional cybersecurity team that includes personnel from legal, information technology, human resources, and public relations departments. The team should also include at least one member of senior management.
    2. Conduct a “privacy survey,” which is the process of identifying the legal, regulatory, and contractual obligations to protect data. Healthcare companies and their business associates must be particularly aware of their obligations to safeguard protected health information under both HIPAA and HITECH. Companies should also consider state laws to protect “personally identifiable information” (“PII”), and should understand contractual obligations, which likely include obligations to protect payment card information (“PCI”) under the rules established by card brands like Visa and MasterCard.
    3. Perform the risk analysis required under HIPAA’s Security Rule. As part of the risk analysis process, companies need to identify the PHI they maintain and develop a detailed understanding of their technical systems and the potential threats they may face.
    4. Segregate sensitive data from regular data and protect it with additional physical, technical, or procedural safeguards (including firewalls, password protection and encryption). 
    5. Implement “privacy by design” when developing cybersecurity solutions. The company should create policies and procedures that account for patient privacy, legal compliance, and data protection throughout the data lifecycle (i.e., collection, processing, storage, and destruction). As part of this effort, the company should develop comprehensive policies to address privacy and data security, including:

      a BYOD policy; a password policy requiring use of strong, complex, unique passwords; personnel policies (including onboarding and off-boarding policies) that enhance security; and a network tracking policy requiring regular monitoring of network traffic for evidence of suspicious access.
    6. Manage vendors and scrutinize the adequacy of their cybersecurity policies and procedures before entering into relationships with them. Enact contractual safeguards to minimize risk, including by requiring protection of sensitive data, providing rights to audit vendors’ security practices, and requiring vendors to notify the company if a breach occurs. The contract should allocate risk in the event that a breach at the vendor harms the company, and companies should consider requiring vendors to carry cyber insurance. Companies must enter business associate agreements with vendors that will have access to PHI.  But before entering a business associate agreement, healthcare organizations should assess whether a vendor’s access to PHI is necessary. If not, the vendor should not have access to the PHI, and the company may avoid the compliance costs associated with business associates.
    7. Engage in cybersecurity information sharing through, for example, the National Health ISAC. The NH-ISAC allows industry players to keep abreast of evolving cyber-attack tactics and industry security standards.
    8. Consider cybersecurity insurance, which, depending on the policy, may cover forensic investigation and system restoration costs; defense and indemnity costs associated with litigation resulting from the loss of personal information or other sensitive data; regulatory investigation defense costs and penalties; notification costs and credit monitoring for affected customers and employees; losses attributable to the theft of the policyholder-company’s own data (including transfer of funds); business interruption costs; costs required to investigate threats of cyber-extortion and payments to extortionists; and crisis management costs, such as the hiring of public relations firms. Unlike many traditional policies, cyber liability policies are not (yet) based on a standard form and therefore differ significantly from insurer to insurer. It is critical to carefully review the exclusions of cyber policies with a broker and coverage counsel.
    9. Develop an incident response plan, which is a detailed plan that outlines how a company will respond to suspected cyber-events. These plans help companies quickly and effectively investigate and remediate attacks. An incident response plan should identify the leaders of the response team and present easy-to-follow, scenario-based responses to different types of cyber incidents. The plan should clearly delineate first steps and include a timeline of major investigative events. The plan should also provide for involvement of experienced legal counsel in all aspects of the investigation of a suspected cyber-event (including communications about the potential event, remediation efforts, and disclosure and reporting) to ensure that the investigation is protected under the attorney-client and work product privileges. Privilege is critical because the company may soon find itself the defendant in a variety of lawsuits, including lawsuits by regulators, customers, issuing banks, or investors.  
    10. Develop a business continuity plan to facilitate efficient data recovery and resumption of operations in compliance with HIPAA requirements. Cyber-attacks may result in victim companies losing access to their data and systems. For example, many companies have been affected by the CryptoLocker malware, which encrypts (and renders useless) a company’s data until a ransom is paid. If companies are not prepared for these types of attacks, they may face enforcement actions, private litigation, and a substantial interruption of services, each of which can be extremely costly. The first step in creating an effective business continuity plan is identifying critical systems. Systems should be prioritized based on the maximum time that each can be down without causing substantial harm to the business. The company must then select a back-up system, considering how quickly the data needs to be restored, how much data must be stored, and how long data must be maintained. It is critical that the company’s back-up system be sufficiently segregated from its day-to-day systems so that a cyber-attacker cannot access the back-up system during an attack.
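    The password requirements in tip 5 (strong, complex, unique passwords) can be enforced programmatically. The sketch below is illustrative only; the specific thresholds (a 12-character minimum, four character classes, no reuse) are assumptions chosen for the example, not requirements stated in the article:

```python
import re

def meets_password_policy(password, previous_passwords=()):
    """Check a candidate password against an illustrative policy:
    strong (minimum length), complex (mixed character classes), and
    unique (not reused). All thresholds here are assumptions."""
    checks = [
        len(password) >= 12,                   # strong: assumed 12-char minimum
        re.search(r"[a-z]", password),         # complex: at least one lowercase letter
        re.search(r"[A-Z]", password),         # complex: at least one uppercase letter
        re.search(r"\d", password),            # complex: at least one digit
        re.search(r"[^A-Za-z0-9]", password),  # complex: at least one symbol
        password not in previous_passwords,    # unique: no reuse of prior passwords
    ]
    return all(checks)

print(meets_password_policy("Tr0ub4dor&3x!"))  # True
print(meets_password_policy("password"))       # False: too short, no mixed classes
```

    In practice an organization would enforce such rules at the identity-provider or directory level rather than in application code, but the check above captures the policy's intent.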

    Emily Westridge Black is an attorney in the Austin office of Haynes and Boone, LLP, and Chris Quinlan is an attorney in the Dallas office. Both specialize in data security, white-collar criminal defense, and the prosecution and defense of complex commercial litigation matters.

  • 26 May 2015 4:44 PM | Deleted user

    retrieved from  |  By ROBERT PEAR  |  MAY 26, 2015

    WASHINGTON — Since President Obama took office, the federal government has poured more than $29 billion into health information technology and told doctors and hospitals to use electronic medical records or face financial penalties.

    But some tech companies, hospitals and laboratories are intentionally blocking the electronic exchange of health information because they fear that they will lose business if they share information on patients with competing providers, administration officials said. In addition, officials said, some sellers of health information technology try to “lock in” customers by making it difficult for them to switch to competing vendors.

    “We have received many complaints of information blocking,” said Dr. Karen B. DeSalvo, the national coordinator for health information technology. “We are becoming increasingly concerned about these practices.”

    The White House and Congress are looking for ways to ensure a freer flow of information.

    Mr. Obama has made computerizing patients’ medical records a priority of his administration. Four weeks after taking office, he signed an economic stimulus bill that offered bonus payments to doctors and hospitals adopting the new technology. He said that it would save money by reducing waste and duplication, and that it could save lives by improving care.

    Many doctors and hospitals have begun using electronic medical records, but providers with different systems are often unable to share data in electronic form.

    “We have electronic records at our clinic, but the hospital, which I can see from my window, has a separate system from a different vendor,” said Dr. Reid B. Blackwelder of Kingsport, Tenn., chairman of the American Academy of Family Physicians. “The two don’t communicate. When I admit patients to the hospital, I have to print out my notes and send a copy to the hospital so they can be incorporated into the hospital’s electronic records.”

    Dr. Peter E. Masucci, a pediatrician in Everett, Mass., said he had been trying for more than five years to connect his electronic medical records with those of a hospital where he often sends patients. “It’s never happened,” he said.

    In a report to Congress, Dr. DeSalvo gave several other examples:

    • A doctor may have to pay exorbitant fees to transfer information on patients from the medical record system of one vendor to that of another.
    • A network of doctors and hospitals refuses to share electronic information on patients with providers outside the network, even when the information is needed to treat patients and sharing the data is allowed by federal rules.
    • Doctors obtain laboratory services and health records technology from one company, which will not let them connect to a competing laboratory, even though the connection is technically feasible and the doctors are willing to pay for it.

    The Obama administration said that some hospitals and technology companies appeared to be using electronic health records as part of a business strategy to “enhance their market dominance.” But Dr. DeSalvo said it was difficult to discover the details of such arrangements because some information technology companies “prohibit customers from reporting or even discussing” terms of their contracts.

    Dr. DeSalvo’s office is working with the Federal Trade Commission, which enforces antitrust and consumer protection laws. Tara Isa Koslov, a lawyer at the commission, said her agency was “paying close attention to developments in health I.T. markets and the reports of information blocking.”

    Peter J. DeVault, a vice president of the Epic Systems Corporation in Verona, Wis., one of the largest suppliers of software for electronic health records, said: “We do not participate in any activities that could be described as information blocking. To our knowledge, these activities are very rare, if they exist at all.”

    But Dan Haley, vice president of a competing company, Athenahealth, based in Watertown, Mass., said: “This is a serious problem. Some health I.T. vendors have business models that create real impediments to the sharing of medical information.”

    Under a typical arrangement, Mr. Haley said, a vendor tells hospitals, “It will cost you $1 million to build a connection to another system, we will charge $500,000 a year to maintain that connection, and on top of that we’ll charge $2 each time we send a patient’s records to hospitals that use a different system.”

    Congress took a first step to address the problem in a bill signed into law last month. It said that doctors and hospitals must not deliberately block the sharing of information if they receive federal bonus payments for using electronic health records.

    A separate bill approved last week by a House committee defines “information blocking” as a federal offense. Doctors, hospitals and technology vendors could be punished with civil fines of up to $10,000 for each violation.

    “Since 2009, we have spent $29 billion to encourage the adoption of electronic health records, but the data is still fragmented,” said Representative Michael C. Burgess, Republican of Texas. “Vendors continue to impede the exchange of information.”

    Even within a hospital, communications can be a challenge.

    William F. Barrow II, president of Our Lady of Lourdes Regional Medical Center in Lafayette, La., said his 186-bed hospital had two systems of electronic health records — one from the Cerner Corporation for inpatients, another from Epic for outpatients.

    “It’s hard to get the two to work optimally together,” Mr. Barrow said.

    Amalia R. Miller, an economist at the University of Virginia who has studied the industry, said: “Even when hospitals have the technological capability, they do not always share information with hospitals outside their system. If information is portable, patients can switch providers more easily, and that could hurt the business of some hospitals.”

    Thus, Ms. Miller said, federal money meant to foster an interconnected web of health care providers may inadvertently have subsidized the creation of “information silos,” making it more difficult to share data and coordinate care for consumers.

  • 26 May 2015 2:23 PM | Deleted user

    retrieved from   |   May 26, 2015

    CAMBRIDGE, Mass.--(BUSINESS WIRE)--PatientsLikeMe and Partners HealthCare announced today that they are working together to give Partners HealthCare patients access to tools and information that can help improve decision making with their clinical teams and enhance health outcomes.

    PatientsLikeMe Executive Vice President of Marketing and Patient Advocacy Michael Evers said the agreement is the first to provide access to the website from within a provider’s patient portal. 

    “We’re excited to work with such an esteemed health system to help patients and their care teams have a more complete understanding of patients’ whole health experience, and to support shared decision making about next steps.”

    Partners Population Health Management Associate Medical Director Adam Licurse, MD, MHS, who is a leader in population health patient engagement efforts at Partners, added the agreement is a key building block towards the healthcare system’s vision for better involving patients in their care. 

    “We know that as patients become more engaged in their care, they have better care experiences, make more informed decisions based on their goals, and in some cases can actually receive higher value care at the end of the day. Peer mentorship, patient self-management, and patient education are all critical pieces to that puzzle. We believe PatientsLikeMe’s online patient community provides a meaningful solution to help meet these needs.”

    As part of the agreement, the organizations:

    • Have provided access to PatientsLikeMe from Partners Patient Gateway, an online tool for patients to learn more about their condition and communicate with their doctors’ offices.

    • Will introduce a new “PatientsLikeMe 101” training series to guide Partners HealthCare clinical teams in helping patients and caregivers get the most out of PatientsLikeMe’s tools and support network.

    • Plan to include patients’ use of the website and its tools at the point of care in select Partners HealthCare clinical care sites and practices. Partners HealthCare clinicians are currently outlining several projects designed to understand how the use of patient-generated health data at the point of care can impact health outcomes, patient engagement, patient empowerment, care coordination and patient satisfaction. The projects are expected to kick off this year.

  • 26 May 2015 11:58 AM | Deleted user

    retrieved from Life as a Healthcare CIO - Dr. John Halamka
    WEDNESDAY, MAY 20, 2015

    21st Century Cures Act

    I’m in Washington today for the HIT Standards Committee and I will post the usual summary of the meeting this evening. However, I wanted to post a morning preview of the opening comments I’ll make at the meeting.

    We are in a time of great turmoil in healthcare IT policy making. We have the CMS and ONC Notices of Proposed Rulemaking for Meaningful Use Stage 3, both of which need to be radically pared down. We have the Burgess Bill, which attempts to fix interoperability with the blunt instrument of legislation. Most importantly, we have the 21st Century Cures Act, which few want to publicly criticize. I’m happy to serve as the lightning rod for this discussion, pointing out the assumptions that are unlikely to be helpful and most likely to be hurtful.

    The interoperability language to be included in the 21st Century Cures Act would sunset the Health IT Standards Committee while a new “charter organization” would help define the standards of interoperability.

    Under the latest language, which was revised over the weekend and yesterday, electronic records must meet those interoperability standards by Jan. 1, 2018 and face being decertified by Jan. 1, 2019 if they don’t. 

    The bill would also require EHR vendors to publish their application program interfaces. Vendors must also publish fees to “purchase, license, implement, maintain, upgrade, use or otherwise enable and support” their products.

    There is no provision mentioning the sharing of substance abuse treatment records, which Rep. Tim Murphy said last week he was working to include. Congressional staffers said the version the House Energy and Commerce Committee marks up Wednesday may still be changed before a floor vote, which is expected sometime next month.

    It does not make sense to officially sanction a “charter organization” and seed it with $10 million, creating yet another player in an already crowded field of groups working on interoperability.  I agree that coordinating the standards development organizations makes a lot of sense -- why not just direct ONC to create a permanent Task Force that reports to the HIT Standards Committee, and let ONC support it out of existing resources?

    The drafts have other significant issues:

    “standards to measure interoperability” –  I have no idea what that means.  I suggest that ONC create and report outcome measures that require interoperability rather than trying to measure the process of interoperability.  With Meaningful Use Stage 2 we experienced the failure of process measures to truly measure interoperability - transitions of care sent from and to the same organizations, transitions of care sent to an outside organization then thrown away.

    “information blocking” - I believe this concept is like the Loch Ness Monster, often described but rarely seen.   As written, the information blocking language will result in some vendors lobbying in new political forums (Federal Trade Commission and Inspector General) to investigate every instance where they are getting beaten in the market by other vendors.  The criteria are not objective and will be unenforceable except in the most egregious cases, which none of us have ever experienced.

    “De-certification” makes no sense.  Every provider would have to be granted a hardship exemption, so what is the point of the decertification?

    So, how do we accelerate interoperability?

    1. Make Meaningful Use and certification more manageable by narrowing their scope but tightening their enforcement. Encourage and expand value-based purchasing initiatives and sunset Meaningful Use as quickly as possible. Meaningful Use and certification should be used to lay the foundation for the ecosystem, but value-based purchasing is what will transform health care.

    2. ONC should focus and narrow the number of projects it is executing simultaneously.

    3. The role of ONC should be certification, safety, alignment of Federal agencies, making available data to support nationwide interoperability (such as NPPES/PECOS data for provider directories), and creating transparency by disseminating market information.

    4. Aggressively clean up privacy/security heterogeneity. We need to get alignment of state laws. We need to remove barriers to patient identity management. We need to get rid of arcane Federal laws such as 42 CFR Part 2. This will require bold leadership and a significant effort. It won’t lead to political career advancement, but interoperability will be enabled and it will improve outcomes for patients over the medium to long term.

    5. We need a full-time leader at ONC once Karen DeSalvo is confirmed as Assistant Secretary of Health.

    Throughout my life, I have tried to be a neutral convener without an agenda.   I hope the industry realizes that my observations above are not meant to be emotional or dogmatic.   My only self-interest is to make a difference and prevent poor legislation and regulation from doing harm.

    Visit Dr. Halamka's Blog

  • 21 May 2015 4:25 PM | Deleted user

    retrieved from Boston Business Journal  
    May 21, 2015  |  Jessica Bartlett

    Children’s has acquired its first-ever primary care physician group – and it’s not in Massachusetts.

    The pediatric institution is finalizing a deal to acquire Children’s & Women’s Physicians of Westchester (CWPW), a group of more than 276 physicians across 57 locations in New York, Connecticut and New Jersey.

    Executives wouldn’t disclose the terms of the deal, but said they expect it to be finalized by July.

    Dr. Kevin Churchwell, executive vice president of health affairs and chief operating officer at Children’s, details why Children’s sought out connections out of state, and what the changes could be long-term.

    How did this relationship come about?

    This has been over a year of discussion, to be honest with you. It started as a mutual conversation as to what the possibilities were or could be between the two groups, knowing we had a history of referrals from the New York area and had relationships with CWPW in the Westchester hospital system. The more we talked, the more we realized there was a significant amount of synergy in our beliefs, values, and commitment to patients, and in recognizing how the landscape of healthcare would change. There would be a great relationship in us working together.

    How will this affiliation work?

    They will be part of our community of care as we continue to develop our northeastern pediatric network. They will have their appointments with the New York medical college and their medical staff for Maria Fareri Children’s Hospital at Westchester Medical Center - and that won’t change. They will also be part of our continuum. We will have the intricacies of their board and decision-making we’re working through, but they will be part of Children’s from that standpoint. They will have local administrative oversight there, with Children’s being the parent and helping support that. It’s new for us so we will continue to work through these intricacies as we develop it over the next year or two years.

    You mentioned you’ve already had referrals from New York, so why is this advantageous to Children’s?

    It’s more about patients and the family. We recognize that Children’s is a local, regional, national, and international destination for patients with special problems. …They also recognize that there is a gap in the continuum of care in patients coming to us and being sent back, and how the continuum is lacking and impacts the quality of care we need to provide…

    We’ll have constant communication, the ability to have an impact of care quality at the local level, from resources but also just the protocols that we have and that they have…especially with those children with tertiary (and higher) care needs. That’s important and the future of medicine…expecting that those children will be at Children’s and will have an easy referral to us and that we can refer back and make sure care remains local.

    Will this change CWPW physician rates?

    That’s a great question. Rates continue to change. We expect the rates won’t change overnight, but there will most likely be some adaptation with our involvement, which will be impacted by the landscape, by different legislation, and by the needs of patients and payers. We will see how that’s going to evolve.

    How many affiliated physicians does Children’s currently have?

    Within Children’s we have our departments that are part of the Boston Children’s Hospital enterprise. That’s a lot of physicians, over 1,000 (approximately 1,300) – specialists in our departments and in our primary care practice that is Children’s-based.

    We don’t own a primary care practice outside of Children’s. We have alliances within New England, and that is through the Pediatric Physicians' Organization at Children's Hospital Boston (a 300-physician group of doctors at more than 75 practices throughout Eastern Massachusetts) – but we don’t own those practices.

    We will purchase the CWPW practice.

  • 15 May 2015 4:21 PM | Deleted user

    Retrieved from
    May 15, 2015 by Rajiv Leventhal

    Another bill regarding ICD-10 has been introduced in the U.S. House of Representatives. Rather than calling for the new coding set to be prohibited, as the most recent bill did, this one pushes for a required ICD-10 transition period following implementation on October 1.

    This bill, H.R. 2247, the Increasing Clarity for Doctors by Transitioning Effectively Now Act (ICD-TEN Act), would “require the Secretary of Health and Human Services (HHS) to provide for transparent testing to assess the transition under the Medicare fee-for-service claims processing system from the ICD-9 to the ICD-10 standard, and for other purposes,” according to a blog post by the Journal of AHIMA (the American Health Information Management Association).

    The bill, introduced on May 12 by Rep. Diane Black (R-TN), would not halt or delay the Oct. 1, 2015 implementation deadline for using ICD-10-CM/PCS, nor would it require the Centers for Medicare and Medicaid Services (CMS) to accept dual coding—claims coded in either ICD-9 or ICD-10. However, the bill would require HHS to conduct “comprehensive, end-to-end testing” to assess whether the Medicare fee-for-service claims processing system based on the ICD-10 standard is fully functioning. HHS would be required to make the end-to-end testing process available to all providers of services and suppliers participating in the Medicare fee-for-service program, according to AHIMA.

    Not later than 30 days after the date of completion of the end-to-end testing process, the HHS Secretary would be required to submit to Congress a certification of whether or not the Medicare fee-for-service claims processing system based on the ICD-10 standard is fully functioning.

    HHS would need to prove that it is processing and approving at least as many claims as it did in the previous year using ICD-9. If the transition is not deemed “functional” based on this benchmark, HHS would need to identify additional steps that it would take to ensure ICD-10 is fully operational in the near future, according to the bill.

    During an 18-month transition period and any ensuing extensions, no reimbursement claim submitted to Medicare could be denied due solely to the “use of an unspecified or inaccurate subcode,” according to the bill.

    “In the past, Congress has repeatedly delayed the switch from the ICD-9 coding system to the far more complex ICD-10 system out of concern about the effect on providers. Neither Congress nor the provider community support kicking the can down the road and supporting another delay, but we must ensure the transition does not unfairly cause burdens and risks to our providers, especially those serving Medicare patients,” Black wrote in a letter urging fellow legislators to cosponsor the ICD-TEN Act. “During the ICD-10 transitional period, it is essential for CMS to ensure a fully functioning payment system and institute safeguards that prevent physicians and hospitals from being unfairly penalized due to coding errors.”

    The most recent ICD-10 bill, H.R. 2126, introduced by Rep. Ted Poe (R-TX) on April 30, would “prohibit the Secretary of Health and Human Services from replacing ICD-9 with ICD-10 in implementing the HIPAA code set.” Soon after that bill was introduced, AHIMA predicted that it could face difficulty getting through the committee process and to the House floor for a vote.

    AHIMA is against this bill as well, as it says ICD-10 contingency plans already supported by CMS have been put in place and are working well. H.R. 2247’s proposed 18-month grace period on coding, where nearly all claims would be accepted, would “create an environment that’s ripe for fraud and abuse,” said Margarita Valdez, senior director of congressional relations at AHIMA.

  • 14 May 2015 10:55 AM | Deleted user

    Retrieved from New England Journal of Medicine
    May 14, 2015  |  Austin B. Frakt, Ph.D., and Nicholas Bagley, J.D.

    What if it were impossible to closely study a disease affecting 1 in 11 Americans over 11 years of age — a disease that's associated with more than 60,000 deaths in the United States each year, that tears families apart, and that costs society hundreds of billions of dollars?1 What if the affected population included vulnerable and underserved patients and those more likely than most Americans to have costly and deadly communicable diseases, including HIV–AIDS? What if we could not thoroughly evaluate policies designed to reduce costs or improve care for such patients?

    These questions are not rhetorical. In an unannounced break with long-standing practice, the Centers for Medicare and Medicaid Services (CMS) began in late 2013 to withhold from research data sets any Medicare or Medicaid claim with a substance-use–disorder diagnosis or related procedure code. This move — the result of privacy-protection concerns — affects about 4.5% of inpatient Medicare claims and about 8% of inpatient Medicaid claims from key research files (see table), impeding a wide range of research evaluating policies and practices intended to improve care for patients with substance-use disorders.

    The timing could not be worse. Just as states and federal agencies are implementing policies to address epidemic opioid abuse and coincident with the arrival of new and costly drugs for hepatitis C — a disease that disproportionately affects drug users — we are flying blind.

    The affected data sources include Medicare and Medicaid Research Identifiable Files, which contain beneficiary ZIP Codes, dates of birth and death, and in some cases Social Security numbers. For tasks common to most health services research — such as combining patient-level data across systems (e.g., Medicare, Medicaid, and the Veterans Health Administration [VHA]), associating them with community or market factors (e.g., provider density or type of health insurance plans available), or studying mortality as an outcome — these are essential variables.

    For decades, CMS has released data on claims related to substance-use disorders to allow researchers to study health systems and medical practice. One early example of such work is a study based on 1991 Medicare claims data that showed that few elderly patients received follow-up outpatient mental health care after being discharged with a substance-use–disorder diagnosis. Patients who received prompt follow-up care were less likely to die, a finding that could not have been obtained without information on patients' precise date of death.2 More recently, a 2010 study used 2003–2004 Medicare claims data linked by Social Security number to records from the VHA to assess the extent to which patients with substance-use disorders relied on the VHA for care.3 Substance-use disorders are among the diagnoses that have been included in the Dartmouth Atlas analyses of geographic variation in Medicare spending — which rely on ZIP Code identifiers — going back to at least 1998. To our knowledge, no patients have been harmed because of data breaches associated with studies such as these.

    CMS has justified the data suppression by pointing to privacy regulations that prescribe the stringent conditions under which information related to the treatment of substance-use disorders may be shared.4 These regulations, which are overseen by the Substance Abuse and Mental Health Services Administration (SAMHSA), already frustrate accountable care organizations and health-information exchanges, since their elaborate consent requirements make it difficult or impossible to share patient data related to substance-use disorders. As a result, many organizations exclude such information from their systems, undercutting efforts to improve care and efficiency.

    For researchers, the problem is more acute. Although the privacy regulations authorize providers to disclose data on substance-use disorders for research purposes, they prohibit third-party payers — including CMS — from doing so. In 1976, when the regulations were first adopted, this prohibition was not a substantial impediment to research. Before computers came into widespread use, researchers could not look to insurers or CMS to provide large claims-based data sets. Even if they could, crunching those data would have been exceedingly difficult.

    But the world has changed. Access to reliable Medicare and Medicaid data has long offered researchers a window into U.S. health care.2,3 Indeed, given the unwillingness of private insurers to share their data, Medicare and Medicaid data often provide our only way of gathering information about medical practice, patient outcomes, and costs. The very importance of the data may explain why CMS has long overlooked the prohibition on disclosure.

    In 2013, however, SAMHSA advised CMS that the privacy regulations require suppression of claims related to substance-use disorders. The agency's sudden insistence on this point is puzzling. The law that the privacy regulations are intended to implement states that identifiable data on substance-use disorders “may be disclosed,” even without patient consent, “to qualified personnel for the purpose of conducting scientific research.” Banning CMS from sharing such data with researchers is difficult to square with that statutory exemption.

    Nonetheless, in November 2013, CMS began scrubbing Medicare data of claims related to substance-use disorders. It did the same for Medicaid data in early 2014. No notice was given to the research community about the policy change. Most of our colleagues have been shocked to learn of it; many others probably remain unaware of the change.

    The suppression has skewed Medicaid data more than Medicare data, a disparity that reflects differences between the populations served by the two programs (see table, and the Supplementary Appendix, available with the full text of this article online). In both programs, inpatient claims are much more likely to be affected than outpatient claims.

    In the vast majority of cases, claims are suppressed because the patients have secondary diagnoses of substance-use disorders. That raises an additional concern: many of the withheld data pertain to admissions for services that address not substance-use disorders but rather conditions that may be exacerbated by substance abuse. In other words, the data suppression extends well beyond its intended domain.

    The effects of the CMS actions are thus much broader than they might initially seem. Clearly, it is now infeasible to conduct any study of patients with substance-use disorders based on Research Identifiable Files. But studies of conditions disproportionately affecting such patients — such as hepatitis C or HIV — will also be hampered. Moreover, any study relying on those files cannot make full diagnosis-based risk adjustments that include substance-use–disorder diagnoses. And because the data have been altered in a systematic, nonrandom manner — with suppression affecting different populations, age groups, regions, and providers to different degrees — the results of many studies that have no apparent connection to substance use will be biased.

    And to what end? Without question, protecting patient confidentiality is essential, especially when it comes to potentially stigmatizing diagnoses and treatments. But there is no evidence that researchers — who, under current rules, must adhere to strict data-protection protocols, backed by criminal penalties — cannot appropriately secure research data. And most Americans want their health data to be available for research.5 At the same time, data suppression and access limitations remove from scrutiny a great deal of taxpayer-financed care.

    We believe that the federal government's short-sighted policy will harm the very people it was meant to protect. We encourage SAMHSA and CMS, in dialogue with researchers and providers, to restore access to data that are necessary to improving care for patients with substance-use disorders.

  • 13 May 2015 3:19 PM | Deleted user

    Misunderstanding these important tools can put your company at risk – and cost you a lot of money

    Retrieved from  |  May 13, 2015

    You’ve just deployed an ecommerce site for your small business or developed the next hot iPhone MMORPG. Now what?

    Don’t get hacked!

    An often overlooked but very important process in the development of any Internet-facing service is testing it for vulnerabilities, knowing whether those vulnerabilities are actually exploitable in your particular environment and, lastly, knowing what risks those vulnerabilities pose to your firm or product launch. These three processes are known as a vulnerability assessment, a penetration test and a risk analysis. Knowing the difference is critical when hiring an outside firm to test the security of your infrastructure or a particular component of your network.

    Let’s examine the differences in depth and see how they complement each other.

    Vulnerability assessment

    Vulnerability assessments are most often confused with penetration tests, and the two terms are often used interchangeably, but they are worlds apart.

    Vulnerability assessments are performed using an off-the-shelf software package, such as Nessus or OpenVAS, to scan an IP address or range of IP addresses for known vulnerabilities. For example, the software has signatures for the Heartbleed bug or missing Apache web server patches and will alert if they are found. The software then produces a report that lists the vulnerabilities found and (depending on the software and options selected) gives an indication of each vulnerability’s severity and basic remediation steps.

    It’s important to keep in mind that these scanners use a list of known vulnerabilities – ones already known to the security community, hackers and the software vendors. There are vulnerabilities that are unknown to the public at large, and these scanners will not find them.
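    The signature matching at the heart of these scanners can be illustrated with a minimal sketch. The two signatures below are a toy stand-in for the tens of thousands of continuously updated checks a real scanner like Nessus or OpenVAS ships with:

```python
# Minimal sketch of signature-based vulnerability matching, the core idea
# behind a vulnerability scanner. The signature list is a toy example.

SIGNATURES = [
    # Heartbleed: OpenSSL 1.0.1 through 1.0.1f leaks server memory.
    {"product": "OpenSSL", "versions": {"1.0.1f"}, "cve": "CVE-2014-0160", "severity": "critical"},
    # Shellshock: bash through 4.3 executes commands hidden in environment variables.
    {"product": "bash", "versions": {"4.3"}, "cve": "CVE-2014-6271", "severity": "critical"},
]

def scan_banner(product: str, version: str) -> list:
    """Return the known-vulnerability signatures matching a service banner."""
    return [
        {"cve": sig["cve"], "severity": sig["severity"]}
        for sig in SIGNATURES
        if sig["product"] == product and version in sig["versions"]
    ]

# An unpatched host matches the Heartbleed signature...
print(scan_banner("OpenSSL", "1.0.1f"))  # [{'cve': 'CVE-2014-0160', 'severity': 'critical'}]
# ...while a patched one matches nothing.
print(scan_banner("OpenSSL", "1.0.1g"))  # []
```

    The key limitation described above is visible in the sketch: a version with no signature on file, however vulnerable, produces no alert.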

    Penetration test

    Many “professional penetration testers” will actually just run a vulnerability scan, package up the report with a nice, pretty bow and call it a day. Nope – that is only the first step in a penetration test. A good penetration tester takes the output of a network scan or a vulnerability assessment and takes it to 11 – probing an open port to see what can be exploited.

    For example, let’s say a website is vulnerable to Heartbleed. Many websites still are. It’s one thing to run a scan and say “you are vulnerable to Heartbleed” and a completely different thing to exploit the bug, discover the depth of the problem and find out exactly what type of information could be revealed. This is the main difference – the website or service is actually being penetrated, just like a hacker would do.

    Similar to a vulnerability scan, the results are usually ranked by severity and exploitability with remediation steps provided.

    Penetration tests can be performed using automated tools, such as Metasploit, but veteran testers will write their own exploits from scratch.
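    The ranking of findings by severity and exploitability mentioned above can be sketched in a few lines. The findings and the four-level scale here are illustrative assumptions; many testers report CVSS scores instead:

```python
# Minimal sketch of ranking penetration-test findings the way a report would:
# successfully exploited, high-severity issues float to the top.

SEVERITY_RANK = {"critical": 4, "high": 3, "medium": 2, "low": 1}

findings = [
    {"issue": "Directory listing enabled", "severity": "low", "exploited": False},
    {"issue": "Heartbleed: leaked session cookies", "severity": "critical", "exploited": True},
    {"issue": "Weak TLS ciphers accepted", "severity": "medium", "exploited": False},
]

# Sort on (exploited, severity) so proven exploits outrank theoretical ones.
ranked = sorted(
    findings,
    key=lambda f: (f["exploited"], SEVERITY_RANK[f["severity"]]),
    reverse=True,
)

for f in ranked:
    print(f["severity"].upper(), "-", f["issue"])
```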

    Risk analysis

    A risk analysis is often confused with the previous two terms, but it is also a very different animal. A risk analysis doesn't require any scanning tools or applications – it’s a discipline that analyzes a specific vulnerability (such as a line item from a penetration test) and attempts to ascertain the risk – including financial, reputational, business continuity, regulatory and others – to the company if the vulnerability were to be exploited.

    Many factors are considered when performing a risk analysis: asset, vulnerability, threat and impact to the company. An example of this would be an analyst trying to find the risk to the company of a server that is vulnerable to Heartbleed.

    The analyst would first look at the vulnerable server, where it is on the network infrastructure and the type of data it stores. A server sitting on an internal network without outside connectivity, storing no data but vulnerable to Heartbleed has a much different risk posture than a customer-facing web server that stores credit card data and is also vulnerable to Heartbleed. A vulnerability scan does not make these distinctions. Next, the analyst examines threats that are likely to exploit the vulnerability, such as organized crime or insiders, and builds a profile of capabilities, motivations and objectives. Last, the impact to the company is ascertained – specifically, what bad thing would happen to the firm if an organized crime ring exploited Heartbleed and acquired cardholder data?

    A risk analysis, when completed, will have a final risk rating with mitigating controls that can further reduce the risk. Business managers can then take the risk statement and mitigating controls and decide whether or not to implement them.
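    The weighing of asset, threat and impact described above is often reduced to a simple scoring model. The sketch below uses illustrative 1–5 scales of our own devising; real methodologies such as NIST SP 800-30 or FAIR are far richer, but the arithmetic is similar in spirit:

```python
# Minimal sketch of a qualitative risk calculation: likelihood x impact, with
# likelihood driven by exposure and threat, and impact by asset value.
# The 1-5 scales and the two example servers are illustrative assumptions.

def risk_score(exposure: int, threat: int, asset_value: int) -> int:
    """exposure:    1 (isolated internal host) .. 5 (Internet-facing)
    threat:      1 (no motivated attacker)  .. 5 (targeted by organized crime)
    asset_value: 1 (no sensitive data)      .. 5 (cardholder or patient data)
    """
    likelihood = exposure * threat   # how likely exploitation is (1..25)
    impact = asset_value             # how bad it would be (1..5)
    return likelihood * impact       # overall risk (1..125)

# The same Heartbleed vulnerability, two very different risk postures:
internal_server = risk_score(exposure=1, threat=2, asset_value=1)
card_data_server = risk_score(exposure=5, threat=5, asset_value=5)

print(internal_server, card_data_server)  # 2 125
```

    The point of the sketch is the distinction a vulnerability scan cannot make: the identical finding yields a very different final risk rating depending on where the asset sits and what it holds.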

    The three different concepts explained here are not exclusive of each other, but rather complement each other. In many information security programs, vulnerability assessments are the first step – they are used to perform wide sweeps of a network to find missing patches or misconfigured software. From there, one can either perform a penetration test to see how exploitable the vulnerability is or a risk analysis to ascertain the cost/benefit of fixing the vulnerability. Of course, you don’t need either to perform a risk analysis. Risk can be determined anywhere a threat and an asset are present. It can be a data center in a hurricane zone or confidential papers sitting in a wastebasket.

    It’s important to know the difference – each is significant in its own way, and they have vastly different purposes and outcomes. Make sure any company you hire to perform these services also knows the difference.

  • 12 May 2015 9:52 AM | Deleted user

    retrieved from  
    May 11, 2015 | Mark Fulford, Partner, LBMC

    In March of 2014, the Office for Civil Rights (OCR) announced that HIPAA audits would start in the fall of 2014. To date, no audits have taken place, and as of this writing, the audit program is still on hold. That said, the OCR is gearing up for the pre-selection process and has announced that audits will commence when the audit portals and project management software are completed.

    Like the start date, the exact number and types (desk vs. on-site) of audits have been in a state of flux. All indicators, however, point to significantly more than the 115 participants selected for the 2011/2012 pilot audit program. Participants will include health plans, healthcare providers and clearinghouses (covered entities) and, in a second round, a cross section of business associates.

    For some healthcare organizations, submitting to an OCR audit will be challenging at best. The HIPAA audit pilot program revealed an egregious lack of attention to HIPAA rules and regulations across the industry. As a result, 2015 OCR audit participants can expect a particular focus on the areas with the most significant observations and findings in 2012: lack of risk assessments, attention to media movement and disposal, and institution of audit controls and monitoring.

    But even if an entity has been reasonably attentive to compliance, it still behooves the organization to do some upfront research on what to expect should it be selected.

    How to respond

    The OCR has not been particularly forthcoming with information on the upcoming audits, so it’s up to individual organizations to interpret what to expect and how to prepare. But the OCR has indicated that — unlike the 2012 pilot program — the audits will be conducted by OCR personnel rather than by a third party. And unlike last time, the audits will lean more heavily toward desk audits, with onsite audits occurring on a case-by-case basis.

    According to information in presentations from Department of Health and Human Services personnel, here is what audited entities need to be aware of:

    1. The data request will specify content and file organization, file names and any other document submission requirements.
    2. Only requested data submitted on time will be assessed.
    3. All documentation must be current as of the date of the request.
    4. Auditors will not have the opportunity to contact the entity for clarification or to ask for additional information, so it is critical that the documents accurately reflect the program.
    5. Submitting extraneous information may make it harder for the auditor to find and assess the required items.
    6. Failure to submit a response to requests may lead to referral for a regional compliance review.

    Document submissions will be no small task, so gathering necessary evidence up front will minimize disruption to day-to-day operations.

    Getting ahead of OCR

    Once an organization receives notification, it should mobilize immediately. If subsequently chosen to submit to an audit, participants will have only a short time to respond. The following are basic steps for a strategic OCR audit plan:

    Gather a team. Privacy and security officials should be assigned to a task force responsible for handling audit requests. It’s also a good idea to notify internal or external legal counsel to keep them on stand-by should guidance be necessary.

    Follow guidelines on how to respond. The OCR will provide specific instructions on how and when to respond. The OCR will not look favorably on a delayed response, and any unrequested documentation that is submitted can still be used in observations and findings.

    Here are some of the areas the OCR audits will cover:

    1. Risk analysis
    2. Evidence of a risk management plan (e.g., a list of known risks and how they are being addressed)
    3. Policies and procedures and descriptions as to how they were implemented
    4. Inventories of business associates and the relevant contracts and BAAs
    5. An accounting of where electronic protected health information (ePHI) is stored (internally, printouts, mobile devices and media, third parties)
    6. How mobile devices and mobile media (thumb drives, CDs, backup tapes) are secured and tracked
    7. Documentation on breach reporting policies and incident response policies and procedures
    8. A record of security training that has taken place
    9. Evidence of encryption capabilities

    Question findings if they appear to be inaccurate. Historically, the OCR has allowed organizations to respond to observations and findings. Organizations that have documented all compliance decisions will fare better when trying to defend their position. There are many areas where HIPAA lacks specific direction; the ability to demonstrate a thoughtful and reasonable approach (in writing) will tend to be viewed favorably. 

    By preparing up front and responding in a timely fashion, most OCR audits should progress fairly smoothly. For organizations that have instituted a reasonably compliant security program, there may be little or no follow-up.

    If there are a significant number of observations and findings, an organization may be subject to voluntary compliance activities, or a more in-depth compliance review. Should an in-depth review uncover significant issues, additional corrective action must be taken and/or fines may be imposed.

  • 11 May 2015 11:51 AM | Deleted user

    Here is the latest NORSE Attack Map - May 11, 2015.

    Norse is dedicated to delivering live, accurate and unique attack intelligence that helps our customers block attacks, uncover hidden breaches and track threats emerging around the globe. Norse offerings leverage a continuously updated torrent of telemetry from the world’s largest network of dedicated threat intelligence sensors. Norse is focused on dramatically improving the performance, catch-rate and return-on-investment for enterprise security infrastructures.

Massachusetts Health Data Consortium
460 Totten Pond Road | Suite 690
Waltham, Massachusetts 02451

For more information,
please contact Arleen Coletti
by email or at 781.419.7818

join our mailing list

© Massachusetts Health Data Consortium