Consortium News

  • 13 May 2015 3:19 PM | Anonymous

    Misunderstanding these important tools can put your company at risk – and cost you a lot of money

    Retrieved from  |  May 13, 2015

    You’ve just deployed an ecommerce site for your small business or developed the next hot iPhone MMORPG. Now what?

    Don’t get hacked!

    An often overlooked but very important step in the development of any Internet-facing service is testing it for vulnerabilities, knowing whether those vulnerabilities are actually exploitable in your particular environment and, lastly, knowing what the risks of those vulnerabilities are to your firm or product launch. These three processes are known as a vulnerability assessment, a penetration test and a risk analysis. Knowing the difference is critical when hiring an outside firm to test the security of your infrastructure or a particular component of your network.

     Let’s examine the differences in depth and see how they complement each other.

    Vulnerability assessment

    Vulnerability assessments are most often confused with penetration tests and often used interchangeably, but they are worlds apart.

    Vulnerability assessments are performed by using an off-the-shelf software package, such as Nessus or OpenVAS, to scan an IP address or range of IP addresses for known vulnerabilities. For example, the software has signatures for the Heartbleed bug or missing Apache web server patches and will alert if they are found. The software then produces a report that lists the vulnerabilities found and (depending on the software and options selected) will give an indication of the severity of each vulnerability and basic remediation steps.

    It’s important to keep in mind that these scanners use a list of known vulnerabilities, meaning they are already known to the security community, hackers and the software vendors. There are vulnerabilities that are unknown to the public at large and these scanners will not find them.
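    The signature-matching idea behind these scanners can be sketched in a few lines. The toy example below is an illustration only, not a real scanner: the banner string and the hand-made signature list are hypothetical, and it simply checks a service banner against known-vulnerable version strings – the same matching that tools like Nessus or OpenVAS perform at scale against far larger signature databases.

```python
# Toy illustration of signature-based vulnerability scanning (NOT a real
# scanner). The signature list and banner below are made-up examples.

KNOWN_VULNERABLE = {
    "OpenSSL/1.0.1e": "CVE-2014-0160 (Heartbleed)",
    "Apache/2.2.15": "multiple missing security patches",
}

def check_banner(banner: str) -> list[str]:
    """Return a human-readable finding for each known signature in the banner."""
    return [
        f"{version}: {finding}"
        for version, finding in KNOWN_VULNERABLE.items()
        if version in banner
    ]

# A hypothetical banner grabbed from a web server:
for finding in check_banner("Apache/2.2.15 (CentOS) OpenSSL/1.0.1e"):
    print("VULNERABLE -", finding)
```

As the article notes, this only finds what is already in the signature list – an unknown (zero-day) vulnerability matches nothing and is silently missed.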

    Penetration test

    Many “professional penetration testers” will actually just run a vulnerability scan, package up the report in a nice, pretty bow and call it a day. Nope – this is only a first step in a penetration test. A good penetration tester takes the output of a network scan or a vulnerability assessment and takes it to 11 – they probe an open port and see what can be exploited.

    For example, let’s say a website is vulnerable to Heartbleed. Many websites still are. It’s one thing to run a scan and say “you are vulnerable to Heartbleed” and a completely different thing to exploit the bug and discover the depth of the problem and find out exactly what type of information could be revealed if it was exploited. This is the main difference – the website or service is actually being penetrated, just like a hacker would do.

    Similar to a vulnerability scan, the results are usually ranked by severity and exploitability with remediation steps provided.

    Penetration tests can be performed using automated tools, such as Metasploit, but veteran testers will write their own exploits from scratch.

    Risk analysis

    A risk analysis is often confused with the previous two terms, but it is also a very different animal. A risk analysis doesn’t require any scanning tools or applications – it’s a discipline that analyzes a specific vulnerability (such as a line item from a penetration test) and attempts to ascertain the risk to the company (financial, reputational, business continuity, regulatory and others) if the vulnerability were to be exploited.

    Many factors are considered when performing a risk analysis: asset, vulnerability, threat and impact to the company. An example of this would be an analyst trying to find the risk to the company of a server that is vulnerable to Heartbleed.

    The analyst would first look at the vulnerable server, where it is on the network infrastructure and the type of data it stores. A server sitting on an internal network without outside connectivity, storing no data but vulnerable to Heartbleed has a much different risk posture than a customer-facing web server that stores credit card data and is also vulnerable to Heartbleed. A vulnerability scan does not make these distinctions. Next, the analyst examines threats that are likely to exploit the vulnerability, such as organized crime or insiders, and builds a profile of capabilities, motivations and objectives. Last, the impact to the company is ascertained – specifically, what bad thing would happen to the firm if an organized crime ring exploited Heartbleed and acquired cardholder data?

    A risk analysis, when completed, will have a final risk rating with mitigating controls that can further reduce the risk. Business managers can then take the risk statement and mitigating controls and decide whether or not to implement them.
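    As a rough illustration of how such a rating might be computed, here is a minimal sketch of a qualitative risk calculation – likelihood times impact, reduced by a mitigation factor. The 1–5 scales and the mitigation values are illustrative assumptions, not a standard methodology.

```python
# Minimal sketch of a qualitative risk rating: risk = likelihood x impact,
# reduced by mitigating controls. Scales and factors here are assumptions.

def risk_score(likelihood: int, impact: int, mitigation: float = 0.0) -> float:
    """likelihood and impact on a 1-5 scale; mitigation is a 0-1 reduction factor."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5 and 0 <= mitigation <= 1):
        raise ValueError("likelihood/impact must be 1-5, mitigation 0-1")
    return round(likelihood * impact * (1 - mitigation), 2)

# Internet-facing server storing card data, vulnerable to Heartbleed:
print(risk_score(likelihood=5, impact=5))
# The same bug on an isolated internal server holding no data:
print(risk_score(likelihood=2, impact=1))
# The card-data server after patching and network segmentation:
print(risk_score(likelihood=5, impact=5, mitigation=0.8))
```

The point of the sketch is the one the article makes: the same vulnerability yields very different ratings depending on the asset, its exposure and the controls around it – distinctions a vulnerability scan alone never draws.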

    The three different concepts explained here are not exclusive of each other, but rather complement each other. In many information security programs, vulnerability assessments are the first step – they are used to perform wide sweeps of a network to find missing patches or misconfigured software. From there, one can either perform a penetration test to see how exploitable the vulnerability is or a risk analysis to ascertain the cost/benefit of fixing the vulnerability. Of course, you don’t need either to perform a risk analysis. Risk can be determined anywhere a threat and an asset are present. It can be a data center in a hurricane zone or confidential papers sitting in a wastebasket.

    It’s important to know the difference – each is significant in its own way and has vastly different purposes and outcomes. Make sure any company you hire to perform these services also knows the difference.

  • 12 May 2015 9:52 AM | Anonymous

    retrieved from  
    May 11, 2015 | Mark Fulford, Partner, LBMC

    In March of 2014, the Office for Civil Rights (OCR) announced that HIPAA audits would start in the fall of 2014. To date, no audits have taken place, and as of this writing, the audit program is still on hold. That said, the OCR is gearing up for the pre-selection process and has announced that audits will commence when the audit portals and project management software are completed.

    Like the start date, the exact number and types (desk vs. on-site) of audits have been in a state of flux. All indicators, however, point to significantly more than the 115 participants that were selected as part of the pilot audit program of 2011/2012. Participants will include health plans, healthcare providers and clearinghouses (covered entities), and in a second round, a cross section of business associates.

    For some healthcare organizations, submitting to an OCR audit will be challenging at best. The HIPAA audit pilot program revealed an egregious lack of attention to HIPAA rules and regulations across the industry.  As a result, the 2015 OCR Audit participants can expect a particular focus on areas that had the most significant observations and findings in 2012: lack of risk assessments; attention to media movement and disposal; and institution of audit controls and monitoring. 

    But even if an entity has been reasonably attentive to compliance, it still behooves them to do some upfront research on what to expect should they be selected.

    How to respond

    The OCR has not been particularly forthcoming with information on the upcoming audits, so it’s up to individual organizations to interpret what to expect and how to prepare. But the OCR has indicated that — unlike the 2012 pilot program — the audits will be conducted by OCR personnel rather than by a third party. And unlike last time, the audits will lean more heavily toward desk audits, with onsite audits occurring on a case-by-case basis.

    According to information in presentations from Department of Health and Human Services personnel, here is what audited entities need to be aware of:

    1. Data requests will specify content and file organization, file names and any other document submission requirements.
    2. Only requested data submitted on time will be assessed.
    3. All documentation must be current as of the date of the request.
    4. Auditors will not have the opportunity to contact the entity for clarification or to ask for additional information, so it is critical that the documents accurately reflect the program.
    5. Submitting extraneous information may make it more difficult for auditors to find and assess the required items.
    6. Failure to respond to requests may lead to referral for a regional compliance review.

    Document submissions will be no small task, so gathering necessary evidence up front will minimize disruption to day-to-day operations.

    Getting ahead of OCR

    Once an organization receives notification, it should immediately mobilize. If subsequently chosen to submit to an audit, participants will only have a short time to respond. The following provides basic steps for a strategic OCR Audit plan:

    Gather a team. Privacy and security officials should be assigned to a task force responsible for handling audit requests. It’s also a good idea to notify internal or external legal counsel to keep them on stand-by should guidance be necessary.

    Follow guidelines on how to respond. The OCR will provide specific instructions on how and when to respond. The OCR will not look favorably on a delayed response, and if unrequested documentation is submitted, it can be used in all observations and findings.

    Here are some of the areas the OCR audits will cover:

    1. Risk analysis
    2. Evidence of a risk management plan (e.g. list of known risks and how they are being dealt with)
    3. Policies and procedures and descriptions as to how they were implemented
    4. Inventories of business associates and the relevant contracts and BAAs
    5. An accounting of where electronic protected health information (ePHI) is stored (internally, printouts, mobile devices and media, third parties)
    6. How mobile devices and mobile media (thumb drives, CDs, backup tapes) are secured and tracked
    7. Documentation on breach reporting policies and incident response policies and procedures
    8. A record of security training that has taken place
    9. Evidence of encryption capabilities

    Question findings if they appear to be inaccurate. Historically, the OCR has allowed organizations to respond to observations and findings. Organizations that have documented all compliance decisions will fare better when trying to defend their position. There are many areas where HIPAA lacks specific direction; the ability to demonstrate a thoughtful and reasonable approach (in writing) will tend to be viewed favorably. 

    By preparing up front and responding in a timely fashion, most OCR audits should progress fairly smoothly. For organizations that have instituted a reasonably compliant security program, there may be little or no follow-up.

    If there are a significant number of observations and findings, an organization may be subject to voluntary compliance activities, or a more in-depth compliance review. Should an in-depth review uncover significant issues, additional corrective action must be taken and/or fines may be imposed.

  • 11 May 2015 11:51 AM | Anonymous

    Here is the latest NORSE Attack Map - May 11, 2015.

    Norse is dedicated to delivering live, accurate and unique attack intelligence that helps our customers block attacks, uncover hidden breaches and track threats emerging around the globe. Norse offerings leverage a continuously updated torrent of telemetry from the world’s largest network of dedicated threat intelligence sensors. Norse is focused on dramatically improving the performance, catch-rate and return-on-investment for enterprise security infrastructures.

  • 11 May 2015 11:21 AM | Anonymous

    US-CERT Alert published May 7, 2015

    Systems Affected

    Systems running unpatched software from Adobe, Microsoft, Oracle, or OpenSSL. 


    Cyber threat actors continue to exploit unpatched software to conduct attacks against critical infrastructure organizations. As many as 85 percent of targeted attacks are preventable [1].

    This Alert provides information on the 30 most commonly exploited vulnerabilities used in these attacks, along with prevention and mitigation recommendations.

    It is based on analysis completed by the Canadian Cyber Incident Response Centre (CCIRC) and was developed in collaboration with our partners from Canada, New Zealand, the United Kingdom, and the Australian Cyber Security Centre.


    Unpatched vulnerabilities give malicious actors entry points into a network. A set of vulnerabilities is consistently targeted in observed attacks.


    A successful network intrusion can have severe impacts, particularly if the compromise becomes public and sensitive information is exposed. Possible impacts include:

    • Temporary or permanent loss of sensitive or proprietary information,
    • Disruption to regular operations,
    • Financial losses relating to restoring systems and files, and
    • Potential harm to an organization’s reputation.


    Maintain up-to-date software

    The attack vectors frequently used by malicious actors, such as email attachments, compromised “watering hole” websites, and other tools, often rely on taking advantage of unpatched vulnerabilities found in widely used software applications. Patching is the process of repairing vulnerabilities found in these software components.

    It is necessary for all organizations to establish a strong ongoing patch management process to ensure the proper preventive measures are taken against potential threats. The longer a system remains unpatched, the longer it is vulnerable to being compromised. Once a patch has been publicly released, the underlying vulnerability can be reverse engineered by malicious actors in order to create an exploit. This process has been documented to take anywhere from 24 hours to four days. Timely patching is one of the lowest cost yet most effective steps an organization can take to minimize its exposure to the threats facing its network.
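    The core check in any patch management process is simple: is the installed version at or above the minimum patched version? The sketch below illustrates that comparison; the package inventory and version numbers are hypothetical, and it assumes purely numeric dotted version strings (real-world schemes with letters or epochs need more careful parsing).

```python
# Sketch of the core check in a patch-management process: compare installed
# versions against the minimum patched version. The inventory is hypothetical.

def parse_version(v: str) -> tuple[int, ...]:
    """Turn a numeric dotted version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def needs_patch(installed: str, minimum_patched: str) -> bool:
    """True if the installed version is older than the minimum patched one."""
    return parse_version(installed) < parse_version(minimum_patched)

inventory = {
    "openssl": ("1.0.1", "1.0.2"),    # (installed, minimum patched)
    "bash":    ("4.3.30", "4.3.30"),
}

for package, (installed, minimum) in inventory.items():
    status = "PATCH NEEDED" if needs_patch(installed, minimum) else "up to date"
    print(f"{package} {installed}: {status}")
```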

    Patch commonly exploited vulnerabilities

    Executives should ensure their organization’s information security professionals have patched the following software vulnerabilities. Please see patching information for version specifics.

    Visit here to see the details on and patches for the 30 targeted software threats.

  • 11 May 2015 9:54 AM | Anonymous

    by Pete Herzog
    retrieved 4-28-2015

    An attack takes down the web server. An office worker notices there’s no response and calls IT support. So a member of IT support goes to the server room.

    He sees the power is on and all the network cables look okay. He goes to the keyboard to log in and sees there’s no shell. Nothing. Where’s the operating system?

    He thinks they got hacked. So he freaks out and calls the CISO, “The web server is dead. What do I do?”

    The CISO answers, “Don’t panic, I can help you. First, let’s make sure it’s dead.”

    There is a silence. Then a loud smash is heard. Back on the phone, the IT support person says “OK, now what?”

    * * *

    Tell me your cybersecurity strategy. If you have a head for business you probably just said a few words to yourself. It was short. It was concise. It was more information than sentence. You know your cybersecurity strategy by heart.

    But if you’re a cybersecurity consultant then you’re probably still mumbling your pitch. The thing is that unless you’re in the business of selling cybersecurity products and services, you really only have one cybersecurity strategy: don’t lose money. And it’s an integral part of any modern business plan.

    So what exactly is a cybersecurity strategy? A strategy is a plan with a set of goals and objectives to get a specific result. A cybersecurity strategy is a cybersecurity plan with a set of cybersecurity goals and cybersecurity objectives to get cybersecurity as a result.

    People who are into selling cybersecurity strategies like to say it also includes specifics on tools and metrics. But that’s really just a trick of adding tactics to the strategy so it doesn’t sound so useless.

    Yes, useless. Fun fact for you. A cybersecurity strategy is useless. There you go. A free tidbit for you. Enjoy. If you’re on Jeopardy someday, the category is business and the answer is “useless” then you’ll be a big winner. You’ll thank me.

    Yes, useless…

    A CEO gets lost deep in the mountains after dark. He whips out his trusty sat phone and calls the office to look up his location on a map. A cybersecurity consultant happens to pick up.

    The CEO explains his situation and tells him that he needs the fastest way out of the woods.

    The consultant is heard tapping furiously at the keyboard, mumbling to himself as he thinks out loud, and after some time gets back on the phone, “You need to just fly out.”

    The CEO shouts, “How the hell do you expect me to grow wings and fly out?!”

    The consultant answers, “How should I know? I’m a strategist.”

    * * *

    The truth is that if you don’t have a cybersecurity strategy for your business it’s because you’ve inherently got one. You’ve never bothered to formally make one because it’s so obvious. Like how you don’t have a formal not dying strategy.

    Your cybersecurity strategy would likely say you don’t want threats of any sort affecting your assets of any sort now or in the future. Obvious.

    It’s such a no-brainer that if time-travel were invented next week and criminals could go back in time to rip you off then your cybersecurity strategy would still be obvious enough to also include that you don’t want to lose assets yesterday too.

    And you didn’t have to even write it down. Or pay a cybersecurity consultancy a Monopoly-style wheelbarrow full of money to do so. So if it’s useless, why is there such a focus on a cybersecurity strategy? Because tactics are hard.

    Too harsh? No, appropriately harsh. It’s easier (and safer) to make a cybersecurity strategy sound like something important despite meaning nothing than it is to make tactics that work.

    You look better longer too because a cybersecurity strategy can go on meaning nothing a really long time but tactics that mean nothing get noticed right away. And I mean that in a bad way not a Hollywood starlet way.

    I know it’s no surprise to you but cybersecurity is hard. Not only do we not know all of the possible threats but even if we did we still couldn’t know all of the shapes those threats could change into.

    Like if getting wet is a threat then what form will it take? Will it be snow, encroaching glacier, broken pipe, condensation, mis-forecasted hurricane, or the tears of a CISSP trying to create cybersecurity tactics?

    But knowing about threats and what to do about them is not needed or important in a cybersecurity strategy.

    No, a cybersecurity strategy, for real, looks like this. And this one is really truly for real, and swear-to-holy-stuff looks like this. I copied it just like this from an official cybersecurity strategy and then lightly anonymized and generalized it:


    1. Securing Company systems – Our clients trust our company with their personal and business information, and also trust us to deliver services to them. They also trust that we will act to protect and advance our business interests. We will put in place the necessary structures, tools and personnel to meet its obligations for cybersecurity.
    2. Partnering to secure vital cyber systems outside the company – Our economic prosperity and our cybersecurity depends on the smooth functioning of systems outside the company. In cooperation with partners and clients we will support initiatives and take steps to strengthen our cyber resiliency, including that of our critical infrastructure.
    3. Helping our users to be secure online – We will assist our employees and clients in getting the information they need to protect themselves and their families online, and strengthen the ability of law enforcement agencies to combat cybercrime.

    The Strategy:

    • Reflects our values such as the rule of law, accountability and privacy
    • Allows continual improvements to be made to meet emerging threats
    • Integrates activity across the whole company
    • Emphasizes partnerships with government, business and academe
    • Builds upon our close working relationships with our allies

    Now, was there a single thing in there that REALLY needed to be written down? How many meetings did it take to write that? How much consultant blood money?

    What’s in there?

    • You will use cybersecurity to not lose assets
    • You will use partners with cybersecurity to not lose assets
    • You will help others use cybersecurity with your stuff to not lose assets

    Check. Check. And Check! Got it! The message is don’t lose assets here just in case you missed it or wanna pay someone to tell you that. And do YOU have that? And I’m saying it’s OKAY that you don’t. Because there’s nothing in there that should be a surprise to you. It’s all obvious.

    Super like wearing a cape obvious. And not just obvious but actually illegal to not consider doing things like following “rule of law”.

    Not to mention the bit about values. Seriously, when’s the last time you thought, “Hey, I’m gonna undertake this task here and I’m not going to do it according to my values. Nope.” Assuming you know what your values are.

    Truthfully, I don’t think I can articulate my own values, but I’m pretty sure it would take serious, conscious effort to do something that was not in my values. Then again, to express in writing that I will follow my values has no value to people who don’t know what my values are or can’t even articulate their own.

    But it’s a plan. Right? We need plans. And a cybersecurity strategy is a plan. Without which we can’t be a cohesive team making solid cybersecurity, right? Right?

    Wrong. You don’t need fluff telling you that your partners and clients and their families need you to have your act together and not lose their assets or them as an asset or their money which is clearly an asset. You know that. And you probably already have that in your business strategy under the heading Don’t Lose Assets.

    But to have a cohesive team making solid cybersecurity you do actually need to outline what you do. Yes, you do.

    And luckily for you, in cybersecurity, that do is to prevent losing assets. And everyone who wants to be in cybersecurity of any kind already knows this and cares about it and is in no way not thinking that their job is the opposite of not losing assets.

    Those cybersecurity professionals aren’t freaking out about the cybersecurity strategy. And telling them is just so not helpful it’s offensive. You see, a cybersecurity strategy is about as effective as someone telling you to calm down and relax when you’re having an argument.

    No, you don’t need strategy. What you need are tactics. And you need to hire the people who know cybersecurity tactics.

    Cybersecurity tactics are where the rubber meets the road. They are the match striking the slate. They are literally the packets smacking the server. They are the way you do the thing you do to the things you have to in order to have cybersecurity. And that’s hard.

    But you don’t need a cybersecurity strategy because you’ve already got one.

    * * *

    All uses of “cyber” in this column are for keyword use only and by no means does the author imply that using such language is appropriate or cool. Furthermore this author does not condone nor deny the use of the word cyber in any way because the author is okay with the word in general, despite its original definition, because language is a living thing and meanings change.

  • 06 May 2015 4:37 PM | Anonymous

    Retrieved from Boston Business Journal  |  May 6, 2015

    Brigham and Women’s Hospital has formed a partnership with a San Francisco-based seed-stage investment fund in an effort to test and potentially integrate digital health startup innovations into the Boston hospital.

    The hospital formed an affiliated medical partnership with Rock Health, and the two organizations are currently in the midst of finalizing plans.

    The partnership is expected to begin this summer and last three years.

    Lesley Solomon, executive director of the Brigham Innovation Hub at Brigham and Women’s Hospital, said the idea is to validate the innovations being funded by Rock Health.

    “We will have the opportunity to collaborate with thought leaders in the digital space developing tech that (has) the potential to dramatically transform health care delivery,” Solomon said. “(We’re trying to figure out) how can we get access to good digital technology that can help us impact patient care.”

    The Innovation Hub helps support internal startups and hosts innovation competitions. Solomon said executives were hopeful that Rock Health, a seed-stage venture fund focused on digital health startups, would also look at investing in Brigham technology, though that wasn't the intended purpose of the relationship.

    “I’m excited,” Solomon said. “For me, Rock Health is a thought leader in the digital space. They have demonstrated that they are committed to helping Brigham entrepreneurs tackle the biggest problems in health care.”

    The startups will be focused around digital health, including devices that connect to the cloud, apps and software platforms, and telemedicine.

    The Rock Health partnership also offers Brigham new access to California startups.

    "It doesn’t limit us from partnering with others, but for us we’ll have the opportunity to talk to the best, work with the best," Solomon said. "And they are based in San Francisco, where we don’t have a presence, so it helps us get access to startups we might not know about here."

    Venture capital firm Bessemer Venture Partners, which has an office in Boston, was a lead investor in Rock Health.

  • 30 Apr 2015 11:37 AM | Anonymous

    Retrieved from  |  Beth Walsh | Apr 29, 2015

    Former National Coordinator for Health IT David Blumenthal, MD, penned a blog post in the Wall Street Journal's "The Experts" addressing the potential for health IT as well as challenges related to interoperability and outdated privacy and security regulations.

    Now president of The Commonwealth Fund, Blumenthal wrote about various scenarios in which health IT tools and mobile applications could help people track and monitor their healthcare by providing interactive, real-time information. But, those advancements can't happen unless electronic devices can communicate with each other. 

    Many EHRs, mobile devices and personal sensors can't exchange information at this point for a variety of reasons but most importantly because "healthcare organizations are fearful of sharing patients' data since it will liberate their customers to go elsewhere for their care." And,  EHR vendors are "charging prohibitive fees and creating other barriers to information sharing" to make it more difficult for customers to "switch out one [EHR] for another," he wrote.

    Blumenthal also wrote that the current privacy and security regulations were conceived and implemented before the internet existed and therefore "don't offer adequate protections for the 21st century." "If people can't trust the privacy and security of cloud-based health records, they won't feel comfortable using them."

    The obstacles, "mostly human in the making, can be solved by humans if the will exists. If we find a way, the healthcare future will be far brighter for all of us." 

  • 30 Apr 2015 11:32 AM | Anonymous

    Retrieved from  |  Apr 30, 2015   

    UnitedHealthcare has announced that it will now cover video visits from Doctor On Demand, American Well’s AmWell, and its own Optum’s NowClinic, which is a white-labeled American Well offering. The insurance company pointed out that the average price of a video visit is less than $50, and as part of its coverage for the service its members will still be responsible for a portion of that fee depending on the deductibles, copays and out-of-pocket expenses associated with their specific benefit plan.

    The payor pointed to the growing shortage of providers in the US: according to the American Association of Medical Colleges, there is a shortage of 45,000 primary care physicians. United said it’s especially a problem for those in rural areas, where 25 percent of the country’s population resides. This group has limited access to healthcare, especially subspecialty care.

    "UnitedHealthcare is developing innovative telemedicine solutions that enable consumers, especially people who live in rural areas of the country, to access quality, cost-effective health care, whether at home or on the go," UnitedHealthcare CEO of the Commercial Group business Jeff Alter said in a statement. "Consumers can save time and money choosing among quality physician groups from the convenience of their smartphone, tablet or home computer at any time of the day." 

    All three offerings allow patients to schedule a video visit via mobile device or desktop with physicians who can discuss and send prescriptions for a wide range of conditions including bronchitis, cough, sinus infection, sore throat, UTIs, vomiting, diarrhea, fever, pinkeye, and flu. Coverage for video visits is currently just available for self-funded employer customers, but UnitedHealthcare said it would cover employer-sponsored and individual plan participants in 2016.

    A United spokesperson told MobiHealthNews in an email that they chose to cover video visits from a couple of remote visit service providers to give their members more choice when they seek virtual care.

    Together the three services reach 47 states and Washington, D.C. UnitedHealthcare members can find a list of participating video visit care providers through UnitedHealthcare’s Health4Me smartphone app – it is available on the "Find and Price Care" page. App users will not only be able to browse the provider groups, they will also be able to view the cost of a virtual visit with each contracted provider group.

    Earlier this week, Doctor On Demand announced that MultiCare Health System would offer the company’s video visits service to Washington-based patients. The new service is co-branded as MultiCare Doctor On Demand, and allows anyone in the state of Washington to communicate with a physician about different medical issues.

  • 28 Apr 2015 3:27 PM | Anonymous

    retrieved April 27, 2015  from

    It’s become all too common to read about theft or mishandling of private health data. Whether due to a targeted attack or unintentional breach, entities and individuals within the healthcare system need greater peace of mind that sensitive data is safe and secure. A new international privacy standard for cloud providers — ISO 27018 — brings an effective means to better protect health data. The privacy standard mirrors some of HIPAA’s tenets while providing an all-important third-party audit mechanism.

    While enacted almost 20 years ago and updated as recently as 2013, HIPAA still falls short of truly protecting personal data in today’s data-rich healthcare system. The two main provisions of the law were meant to protect health insurance coverage for an employee following job loss and to set standards for electronic transactions involving healthcare data. The latter provision, put into place in 2003, contains the Privacy Rule that governs the use and disclosure of Personal Health Information (PHI).

    The matter of business associates

    As originally written, the HIPAA Privacy Rule applied to covered entities or, generally speaking, health insurance companies, employer provided health plans, and some healthcare providers. The rule forbids any covered entity from using PHI for marketing purposes without patient authorization. 

    Some aspects of the original Privacy Rule also applied to business associates, the third-party organizations that covered entities use when performing their healthcare activities. As the Department of Health and Human Services states, examples of business associates include a consultant who produces utilization reports for a hospital and a healthcare clearinghouse that translates claim data from one format to another.

    In 2013, the law was updated to ensure that all aspects of the Privacy Rule apply to business associates as well as covered entities. But this expansion is not enough to adequately protect personal healthcare data. The rub lies in who qualifies as a business associate, which may not include all of the technology service providers that manage data behind the scenes. A business associate is required to enter into a Business Associate Agreement (BAA).

    A BAA creates the legal relationship between the covered entity and business associate. It governs the permitted use of PHI and requires business associates to put into place safeguards to “prevent unauthorized use or disclosure of the information.” The business associate is prohibited from using or disclosing healthcare data in any way that violates HIPAA.   

    In the decade between the issuance of the original and updated HIPAA regulations, the number and types of business associates exploded. In addition, electronic healthcare transactions have increased exponentially. In 2015, the U.S. market for electronic records alone is expected to reach $9 billion. Companies that provide the underlying infrastructure for healthcare data transactions, including cloud providers, email systems, and intranet services, are also part of the landscape that should protect PHI.

    While the new HIPAA privacy rule includes language to bring an increasing number of technology service providers under the business associate umbrella, it is not clear whether industry practice has kept pace with the new rule. 

    Call to action: Embrace ISO 27018

    To address this gap and better protect the privacy of PHI, the government must adopt the tenets of ISO 27018.

    Although the standard, as currently written, focuses on Personally Identifiable Information (PII), it can serve as a rough benchmark for how a technology service provider will handle PHI as well. And while the substantive requirements of ISO 27018 may not match the requirements of HIPAA exactly, they overlap significantly. This overlap is valuable, as it allows third-party validation of these requirements through an audit process.

    ISO 27018 provides a strong litmus test for entities that handle sensitive information such as PHI, and it will help those entities choose among technology service providers.

    Technology service providers that have undergone a successful audit for the controls under ISO 27018 can demonstrate a commitment to using the types of security and privacy controls required for handling such sensitive information. 

    Complying with ISO 27018 means public and private sector entities, as well as the individuals who entrust them with their data, can rest easier knowing that their data will not be reused by technology companies without their consent.

  • 23 Apr 2015 9:21 AM | Anonymous

    retrieved April 22, 2015   |   from

    Atrius Health, a Newton, Mass.-based nonprofit multispecialty medical group, is the latest organization to open its own innovation center.

    The nonprofit will invest $10 million in a center that will aim to develop and improve patient-centered care delivery models. Atrius, which is part of a Pioneer Accountable Care Organization (ACO), joins other health systems and hospitals, including New York Presbyterian, Hospital for Special Surgery, and Cleveland Clinic, that have created specialized innovation arms of their organizations. While those have focused specifically on technology innovations, this one will be more about risk-based sharing models.

    "As the industry continues to move towards increased risk sharing, accountability and transparency around performance, increased patient engagement, and demands for big gains in efficiency, it is requiring us to rethink how we deliver care," stated Daniel Burnes, M.D. Transition CEO of Atrius Health. 

    Atrius says the Innovation Center will not only be an independent business unit within Atrius Health but will also be overseen by an advisory board made up of members of the Atrius Health board and the organization’s senior leadership.

Massachusetts Health Data Consortium
460 Totten Pond Road | Suite 690
Waltham, Massachusetts 02451


© Massachusetts Health Data Consortium