Adults want control and limits on collection of their information – students deserve the same

As I read today's report from the Pew Research Center on Americans' Attitudes about Privacy, Security and Surveillance, a few of the survey results jumped out at me (the results below are from the report linked above):

  • 93% of adults say that being in control of who can get information about them is important
  • 90% of adults say that controlling what information is collected about them is important
  • 88% say it is important that they not have someone watch or listen to them without their permission
  • Most want limits on the length of time that records of their activity can be retained
  • Americans have little confidence that their data will remain private and secure

Also:

  • 93% say it’s important to be able to share confidential information with another trusted person

Does this sound familiar? The debate about student data revolves around many of the same considerations: We recognize that collection and sharing of information can be beneficial in the academic setting, but are concerned with who can collect what data, what permissions are required, how long the data can be retained, and how it will be kept secure and private.

In this survey, adults gave the loud and clear message that they feel it's important to have control and limits on the data collected about them. We can expect that today's students will hold similar views about their own information when they become adults. As children, though, they have almost no capacity to control or limit the information collected about them, how long it's kept, or how well it's protected.

It’s up to us — the adults — to look out for their interests, with the same level of importance we place on looking after our own information.

Security Observations of Schoolmessenger’s Secure Document Service

In this post I’ll discuss my observations of the security of Schoolmessenger’s secure document delivery service. Schoolmessenger is a signatory of the Student Privacy Pledge.

[Update: May 5, 2015,  I received word last week of the following security improvements from Schoolmessenger, summarized here and updated inline below.  Thanks to Schoolmessenger for their responsiveness and work toward a more secure service!]

  • Forced HTTPS
  • A return to 256-bit SSL
  • Reduction of Personally Identifiable Information in the notification email and the body of the page

Email TLS is on the roadmap for future work.

[Update: this post was updated on March 20th to add observations of the password-protected document delivery method]

Overview of the service

The Schoolmessenger Secure Document Delivery service can be used by schools to deliver documents to parents.   Parents receive an email with a link to the document, which is served using SSL/HTTPS when the parent clicks the link.  The document can be configured by the school district to be served with or without password protection.  Schoolmessenger’s description of the service can be found at this link.

Summary of observations

The notification email from Schoolmessenger to the parent is sent from Schoolmessenger's servers without Transport Layer Security (TLS). (Email TLS provides encryption and authentication of email as it traverses the shared network between companies' mail servers. I wrote a detailed post on TLS for email last week.) Though the document itself is not stored in the browser cache, the URL providing access to it is, and the URL is also stored in the browser history. Files sent with passwords reveal the name of the student, the school, and the contact information of the school administrator who sent the file before the password is checked. If the URL is modified to http instead of https, the document is served without SSL. Company materials describe the service as using 256-bit SSL, but the server hosting the document exclusively uses 128-bit SSL.

I would like to thank and acknowledge the contributions of a colleague who works at a large school district, who collaborated on this analysis.

Notification and response

In early March I requested information about which domain was used to host documents from the Secure Document service and asked for a copy of their Specification Sheet, using the contact form here. On March 16th I notified the company of the findings included in this report by emailing them to feedback@schoolmessenger.com, on the advice of a Schoolmessenger call center agent. I sent a follow-up mail on March 18th to the same address. Schoolmessenger has not responded to my inquiries.

Detailed observations

TLS on outgoing mail 

Email TLS is used to encrypt and authenticate mail as it traverses the shared network between email servers (see my previous post on email TLS for more details).  The Google Safer Email page shows that none of the mail Schoolmessenger sends (via swiftwavenetwork) to gmail recipients uses email TLS.  (Note that Google’s data lags by about 5 days – today’s snapshot shows observations from March 14th.)

Screen Shot 2015-03-20 at 7.37.43 AM

Google’s data is consistent with the headers of the mail I received in my gmail account, which do not indicate that TLS was used.

Screen Shot 2015-03-15 at 8.30.29 PM

URL contains sensitive information and is stored in browser history and cache

The URL for the secure document contains unique keys that identify a document.  In the links I received, there was an 11-character code that appeared to be base64 and a 64 character hexadecimal code.

https://msg.schoolmessenger.com/m/?s=<11-character base64 code>&mal=<64-character hex code>

This is counter to OWASP ASVS requirement 9.3:

Screen Shot 2015-03-08 at 11.04.55 PM

One reason for this requirement is that the URL can be stored in the browser history, as shown below. It can be viewed in the history or reached by using the back arrow.

Screen Shot 2015-03-15 at 8.49.04 PM

The optional password protection largely mitigates this risk. If it is not enabled by the school district, the URL in the history gives direct access to the file. Though the URL could also be captured in server or proxy logs, it is a concern primarily if the service is used from a shared computer where the history could be viewed by other users.

The secure document is transferred as the response to a POST request and is not cached by the browser. However, the response to the initial GET request is cached in the browser's disk cache, which leaves the URL stored on disk as well.

Screen Shot 2015-03-15 at 8.59.50 PM

Screen Shot 2015-03-15 at 9.41.44 PM

As with history logging, this is mitigated for files that are configured by the school to require passwords for viewing, and is primarily a concern only if the document is viewed from a shared computer.
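On the server side, this kind of disk caching can be prevented with response headers: a "Cache-Control: no-store" directive tells the browser not to write the response to disk. A minimal sketch of the check a tester might apply (the helper is hypothetical, not Schoolmessenger's code):

```python
# Flag responses that the browser is allowed to write to its disk cache.
# Sensitive pages should send "Cache-Control: no-store" to prevent this.
def is_disk_cacheable(headers: dict) -> bool:
    """True if the response lacks the directive that blocks disk caching."""
    cache_control = headers.get("Cache-Control", "").lower()
    return "no-store" not in cache_control
```

A response with no Cache-Control header at all, as observed here, is flagged; one sent with "no-store" is not.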

Documents sent with passwords reveal information before the password is entered by the user, with limited controls on brute force password attacks  [Update: this section was added on March 20th, after the original report was posted]

[Update 5/5/2015 – Schoolmessenger has notified me that there is now less PII in notification emails and page text]

If a document is distributed with the optional password enabled, the page served by the link sent to the parent reveals information before the password is entered or verified.  The page includes the full name of the student, the name of the school district, and the name and email address of the school official who sent the document.

Screen Shot 2015-03-20 at 5.44.47 PM

In a quick check of protections against brute force attempts on the document password, I observed no rate limiting between requests. After about 1000 attempts the site rejected new attempts for under a minute, then started accepting them again. According to the company's literature the password can be chosen from fields of the document; this knowledge could be used to constrain the set of passwords used in a brute force attempt. For example, 1000 attempts is about 3 years' worth of birthdates.
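The arithmetic behind that last point can be made concrete: the brute-force search space for a date-of-birth password is simply the number of days in the plausible range. A quick sketch (the dates are illustrative):

```python
from datetime import date

def candidate_birthdates(start: date, end: date) -> int:
    """Size of the search space when brute-forcing a date of birth."""
    return (end - start).days + 1

# Roughly three years of birthdates fit in about 1000 guesses:
candidate_birthdates(date(2004, 1, 1), date(2006, 9, 26))  # -> 1000
```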

Site serves the document over unencrypted HTTP if the URL is modified to http

[Update 5/5/2015 – Schoolmessenger has notified me that the site now forces HTTPS]

If the URL is modified to use "http" instead of "https", the document is served over plain HTTP, without encryption. The proxy log below shows the PDF file being transferred using http. Redirecting http requests to https, combined with HTTP Strict Transport Security (HSTS), would prevent this from happening.
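The fix is straightforward to verify from the client side: a request to the http URL should come back as a redirect to https, and the https response should carry a Strict-Transport-Security header. Sketches of the two checks as pure functions over a response's status and headers (the helper names are mine):

```python
def redirects_to_https(status: int, headers: dict) -> bool:
    """True if a plain-http request was answered with a redirect to https."""
    return (status in (301, 302, 307, 308)
            and headers.get("Location", "").startswith("https://"))

def has_hsts(headers: dict) -> bool:
    """True if the response tells the browser to insist on https from now on."""
    return "Strict-Transport-Security" in headers
```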

Screen Shot 2015-03-15 at 9.49.18 PM

Product description materials describe 256-bit SSL; servers implement 128-bit SSL

[Update 5/5/2015 – Schoolmessenger has notified me that the site has returned to 256-bit SSL]

Though this is not a significant security risk, it is worth noting that the SSL implemented by the service does not appear to match the product description on the company website.  One example of this is shown in the screenshot below, from the document at this link.

Screen Shot 2015-03-15 at 9.54.59 PM

In my browser I observed that the document download page was served with 128-bit SSL.

Screen Shot 2015-03-15 at 9.57.41 PM

A Qualys SSL Labs TLS check shows that the msg.schoolmessenger.com domain's SSL is configured to exclusively use 128-bit encryption.

Screen Shot 2015-03-15 at 10.10.04 PM

For reference, the summary of the Qualys SSL Labs TLS check is shown below.

Screen Shot 2015-03-20 at 7.33.20 AM

Email Transport Layer Security, Educational Sites and Apps, and Schools

This post will focus on Email Transport Layer Security. This is often just called TLS and sometimes ‘starttls’.  As the name implies, TLS protects the contents of traffic as it travels between endpoints on the internet by encrypting the content and by verifying the authenticity of the servers at the endpoints.   The SSL/HTTPS used by browsers is a form of TLS that protects traffic flowing between browsers and web servers.

Email TLS protects the contents of emails as they travel between mail servers. For example, if an educational service (let's call it SuperDuperReaders.com) sends a class update to a teacher's Google for Education email account, TLS can be used to encrypt the email as it travels from SuperDuperReaders.com's email server to Google's email server. (The Gmail interface handles encryption between Google's servers and the teacher's email client.) Google's Safer Email site has more information about how this works, and I'll refer to this site again later in the post.

This is back-end plumbing, and a remote threat compared to sending the same content from a mail server to a user's email client without encryption (for example, if the user is reading the email over open airport wifi). However, maintaining a comprehensive security program means being mindful of potential risks to the security of information and systematically taking steps to protect against them. Another consideration is that a number of small vulnerabilities can sometimes be combined into more powerful exploits. Furthermore, I am not aware of a convincing reason not to support TLS for email. Security is difficult; it's important to enable the "easy" protections when they are available.

One more detail of email TLS is that both ends of the connection must support it, and it is up to the server sending the mail to request that it be used for the transfer. (This is where 'starttls' comes from: it's the name of the request for TLS from the server sending the mail.) Continuing the previous example, if SuperDuperReaders.com wanted to send email to a teacher's Google Apps for Education email account using TLS, SuperDuperReaders would initiate the connection with Google and ask for TLS with a starttls request. Google's server would reply that it supports TLS, and the encrypted connection would be set up.
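That handshake can be probed directly with Python's standard library. This sketch connects to a receiving mail server on port 25 and reports whether it advertises STARTTLS in its EHLO reply (the host name in the comment is a placeholder, and many networks block outbound port 25):

```python
import smtplib

def supports_starttls(mx_host: str, timeout: float = 10.0):
    """Send EHLO to an SMTP server and report whether the STARTTLS
    extension is advertised. Returns True or False, or None if the
    server could not be reached."""
    try:
        with smtplib.SMTP(mx_host, 25, timeout=timeout) as smtp:
            smtp.ehlo()
            return smtp.has_extn("starttls")
    except (OSError, smtplib.SMTPException):
        return None

# e.g. supports_starttls("aspmx.l.google.com") should return True from a
# network that allows outbound SMTP, since Google advertises STARTTLS.
```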

Returning to Google’s Safer Email site:  At this site Google posts daily percentages of mail sent and received using TLS for the domains with the most traffic to and from Google’s mail servers.  For mail coming in to Google, the mail will only be sent with TLS if the other site (SuperDuperReaders for example) requests it.  Google always requests TLS for email connections, so for mail leaving Google it will only be sent with TLS if the receiving domain supports TLS.  The Safer Email website allows searches by domain name, and the data set can be downloaded from the site.

As an example, the screenshot below shows the results for edmodo.com from today’s dataset.  Google reports that 0% of the mail entering Google from Edmodo is sent with TLS. This means that Edmodo’s servers are not requesting TLS when sending email.  Outbound mail is not listed because not enough mail flows from Google to Edmodo to be recorded in this dataset.

(UPDATED: On March 11th Edmodo enabled STARTTLS for outgoing email, and this should show up in the Google Safer Email results by March 16th or 17th, as they lag by 5 days or so – thanks to Edmodo for quick response!)

Screen Shot 2015-03-10 at 3.39.24 PM

I reported this to Edmodo in June 2014. At that time I generated a mail from Edmodo by entering a post in a test classroom, and examined the email headers to confirm that TLS was not used during transmission from Edmodo to my email account. The text of the email is shown below.

Screen Shot 2015-03-10 at 3.50.34 PM

Though the message text is admittedly somewhat contrived, it does demonstrate a case where a communication sent with an expectation of privacy, containing personal information of students, crossed a portion of the network without encryption. (The students in this example are not real.) Emails from other services might contain more sensitive information than this example does.

Google's site is a way for end users to understand what companies are doing with email TLS without having to decipher email headers. It can also be used to check what school districts are doing, if they send enough mail to be included in Google's dataset. For example, a dataset downloaded today contained the domain names of some school districts and educational organizations that appear to be sending email without requesting TLS from the receiving side. These include:

browardschools.com, cherrycreekschools.org, fultonschools.org, charterschoolsusa.com

A search on domains with ‘k12’ that have 0% incoming TLS yields this:

beaumont.k12.tx.us, bluevalleyk12.org, cms.k12.nc.us, mcms.k12.nc.us, cobbk12.org, dcsdk12.org, dpsk12.org, elkriver.k12.mn.us, forsyth.k12.ga.us, greenville.k12.sc.us, henrico.k12.va.us, jeffco.k12.co.us, mason.k12.oh.us, sdhc.k12.fl.us, sheboygan.k12.wi.us, worthington.k12.oh.us

Note that Google's Safer Email dataset changes day to day based on the amount of email traffic seen by Google per domain.

For those inclined, here is how to check if email in a gmail inbox was sent to Gmail with TLS:

Select the “show original” menu item for the mail you want to check.

Screen Shot 2015-03-10 at 4.54.49 PM

This will bring up a view of the email header.  Look for “TLS” in the text.
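This manual check can also be scripted. When TLS was used, Gmail's Received: line for the hop records "with ESMTPS" and a "version=TLS..." note; a small heuristic (the sample header below is fabricated):

```python
def received_hop_used_tls(received_header: str) -> bool:
    """Heuristic: a Received: line mentioning ESMTPS or a TLS version
    indicates the hop into Gmail was encrypted."""
    return ("version=TLS" in received_header
            or "with ESMTPS" in received_header)

sample = ("from mta7.am0.yahoodns.net (mta7.am0.yahoodns.net. [98.136.217.203]) "
          "by mx.google.com with ESMTPS id ab1si234 "
          "(version=TLSv1.2 cipher=ECDHE-RSA-AES128-GCM-SHA256)")
received_hop_used_tls(sample)  # -> True
```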

For example, here is part of the headers for the mail above, confirming it was sent from Yahoo to Google using TLS.

Screen Shot 2015-03-10 at 4.51.17 PM

For completeness, here is the similar portion of headers for the mail from Edmodo in June (shown earlier in the post), sent without TLS from Edmodo to Google.

Screen Shot 2015-03-10 at 4.46.13 PM

(updated after posting to add k12 domains to the list of school district domains with 0% TLS)

Infosnap Security Observations: (Updated 6/7/15 to reflect many security improvements)

In this post I’ll be reviewing the security practices of Infosnap.  Infosnap is a signatory of the Student Privacy Pledge.

UPDATED: June 7, 2015: Infosnap has made numerous improvements to the service’s security, as described below. Over the past few weeks I have had a very open and collaborative interaction with Infosnap to discuss the many improvements, and verify them from my side.  Many thanks to everyone at Infosnap for making these changes and for enabling the open communications about them.

The changes are listed here and also updated inline. I have independently verified all of these.

Application Software

• Changes were made to the user login process to provide identical responses to invalid username and password combinations. This was done to make it more difficult to test potential usernames for validity.

• Changes were made to the user login process to obfuscate the email address of the account holder. This was done to reduce the sensitive information displayed to the user.

Network

• Changes were made to the session authentication token to set the “httpOnly” flag. This was done to prevent an XSS (Cross-Site Scripting) exploit from gaining access to a session cookie.

• Changes were made to the session authentication token to set the “secure” flag. This protects against accidental transmission of the session cookie across an unencrypted link.

• Changes were made to invalidate the session authentication token at logout. This was done to prevent the unauthorized reuse of a session cookie.

• Changes were made to examine headers for the X-Frame-Option. This was done to prevent frame sniffing attacks.

• Changes were made to reduce server details included in response headers. This was done to reduce the amount of internal system information provided.
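The header-related items above are easy to spot-check against a captured response. A sketch of such a check (the helper and its rules are mine, not Infosnap's):

```python
def header_findings(headers: dict) -> list:
    """Return findings for missing or over-verbose response headers."""
    findings = []
    # Frame-sniffing / clickjacking protection:
    if headers.get("X-Frame-Options", "").upper() not in ("DENY", "SAMEORIGIN"):
        findings.append("missing or weak X-Frame-Options")
    # The Server header should not leak version details:
    server = headers.get("Server", "")
    if any(ch.isdigit() for ch in server):
        findings.append("Server header reveals version details")
    return findings
```

A response with "X-Frame-Options: SAMEORIGIN" and a bare "Server: nginx" yields no findings; "Server: Apache/2.2.3" with no X-Frame-Options yields both.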

Snapcodes

• A change was made to the recommended procedure for utilizing Snapcodes. The standard procedure is to require both Snapcode and Date of Birth verification for authorization.

• Changes were made to the Snapcode authentication process to limit the number of times a user can provide an incorrect Date of Birth before the Snapcode is invalidated. This was done to make brute force attacks more difficult.

Notification and response

Update: In May 2015 Infosnap notified me of many security enhancements as described in the updates to this post.

I first reported the observations in this post to Infosnap in August 2013. Infosnap responded that the issues could not be addressed at that time without causing significant impact and potential usability issues for its user base. I reported them again in August 2014 and received a similar response. The Student Privacy Pledge site does not show the dates that companies signed, but the Wayback Machine's January 20th snapshot shows that Infosnap had signed by then. In March 2015 I reconfirmed most of the observations outlined below. Since March, Infosnap has made numerous security improvements, as described in the updates to this post and as captured in the updated Web App Security Test Plan results included at the end of the post.

Overview of the service

Infosnap is a school registration service. Schools pre-load some information, and Infosnap sends parents an access code, called a "snapcode", by email. Parents use this snapcode to access the student's record by associating it with an Infosnap account, and complete the registration. An excerpt of an email sent to a parent is shown below.

Screen Shot 2015-03-08 at 12.44.02 AM

The school preloads information from its records, including the student's name, address, and date of birth, along with the student's parents and their contact information. When parents access the record they can update this information, add student medical information, and select directory sharing preferences. The screenshot below shows some of the personal information that is preloaded to the service.

Screen Shot 2015-03-08 at 10.31.44 PM

The next screenshot shows the information about medical conditions that is collected by the service.

Screen Shot 2015-03-08 at 12.55.16 AM

Security observations

The first person to present a snapcode gets access to the records, with no double checks against email addresses known to the school

Update 6/7/15: This is mitigated by the changes to make birthdate verification the default, and to lock out a snapcode after 4 incorrect birthdates are presented by the user  

When a snapcode is presented, the user logs in (if a returning user) or creates a new account, and the student record is associated with that account. Account usernames are email addresses, and there is no verification that the student record is being accessed by the owner of an email address known to the school for that student. This means that anyone who obtains a snapcode can access the student record if they present it before a parent uses it to register the student. There is a feature that the school can configure to require the student's birthdate before granting access, but I observed that it was not rate limited and could be quickly bypassed with a script or "fuzz tool" that automatically cycles through possible birthdates.
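A rate or attempt limit closes this hole. The lockout Infosnap later adopted (per the 6/7/15 update, the snapcode is invalidated after 4 incorrect birthdates) can be sketched as a per-snapcode counter; the storage and return values here are my own assumptions:

```python
MAX_ATTEMPTS = 4  # per the 6/7/15 update: lockout after 4 wrong birthdates

failed = {}  # snapcode -> count of wrong birthdates (assumed in-memory store)

def verify_birthdate(snapcode: str, dob_given: str, dob_on_file: str) -> str:
    """Return "ok", "retry", or "locked" (snapcode invalidated)."""
    if failed.get(snapcode, 0) >= MAX_ATTEMPTS:
        return "locked"
    if dob_given == dob_on_file:
        failed.pop(snapcode, None)
        return "ok"
    failed[snapcode] = failed.get(snapcode, 0) + 1
    return "locked" if failed[snapcode] >= MAX_ATTEMPTS else "retry"
```

Once a snapcode is locked, even the correct birthdate no longer grants access, which is what defeats an automated guesser.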

Snapcodes are sent in email and included in URLs

As shown above, snapcodes are sent to parents in email. These codes should be handled like passwords, since both grant access to the information they protect. Sending passwords in email is documented to be counter to best practices, and snapcodes should be treated with similar care. This is discussed in the OWASP Application Security FAQs, which state: "Emailing the actual password in clear text can be risky as an attacker can obtain it by sniffing. Also the mail containing the password might have a long life time and could be viewed by an attacker while it is lying in the mailbox of the user. Apart form the above threats, a malicious user can do shoulder-surfing to view the password or login credentials."

The embedded link in the email contains the student's snapcode in the URL. This is also against best practices, because the URL may be logged in browser history or server logs. This requirement is part of the OWASP Application Security Verification Standard; the text of the requirement is shown below.

Screen Shot 2015-03-08 at 11.04.55 PM

Valid snapcodes can be used to extract student and parent information without logging in to an account

Update: 6/7/15: changes have been made to obfuscate the parent’s full email address on all interfaces

If a parent has already registered and "claimed" the snapcode, an attacker in possession of the snapcode could still extract considerable information by attempting to attach the snapcode to another existing account controlled by the attacker. This information includes the name of the school district (served in a banner at the top of the page), the child's first name and birthdate, and the parent's email address, first name, and last initial.

After entering the snapcode to the account, the service asks for the student’s DOB as an additional check (optional and controlled by the school).  The dialog that asks for this includes the student’s first name.

Screen Shot 2015-03-08 at 11.19.28 PM

If the birthdate presented is incorrect, the user is prompted to try again. I observed no rate limits or attempt limits on this check. As described above, an attacker could use an automated tool to quickly bypass it, and in the process discover the student's birthdate.

Once past the birthdate check, the potential attacker would be presented with a page that reveals the parent’s first name and last initial. The email address is obscured but can be obtained in another way that I’ll show later.

Screen Shot 2015-03-08 at 11.23.58 PM

If a valid snapcode is presented that has already been “claimed” and a potential attacker selects the option to create a new account, the parent’s full email address is revealed.

Screen Shot 2015-03-08 at 11.29.00 PM

Consider that, among other things, an attacker could construct a phishing attack using all of this information (school/district name and logo, student and parent first name, parent last initial, student birthdate, parent email address) to make a malicious email or web link look legitimate.

Authentication token and session handling

Update 6/7/15: The session cookie’s httpOnly and Secure flags are now set, and session cookies are invalidated when a user logs out.

Note: this is more technical than the sections above, but important. An attacker in possession of a user's session cookies can potentially control the user's account and view all the information it holds. Broken authentication and session management is #2 in the latest OWASP Top 10 list of web app vulnerabilities. Refer to my earlier post on Session Cookies for more information.

Several best practices are not followed with respect to protecting session authentication tokens, which are codes that give a user access to an account after presenting username and password.  Problems include:

  • Not invalidating the session authentication token when a user logs out
    • This means that if an attacker obtains the token, the user cannot revoke that access by logging out
  • Not setting the 'secure' flag on session authentication token cookies or using strict transport security, which protect against accidental transmission of session authentication tokens over unencrypted links
    • The 'secure' flag prevents the cookie from being sent over an unencrypted link
      • It's not uncommon to see this flag left unset due to requirements of load balancers in the infrastructure, but in that case strict-transport-security must be used, and it is not
    • Strict-transport-security tells a browser to always connect to a site over an encrypted link, even if the URL is presented as http
  • Not setting the httpOnly flag on session authentication token cookies, which could allow them to be read by a malicious script and sent to an attacker
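These flags are simply attributes of the Set-Cookie header. A sketch using Python's standard library shows what a hardened version of the header would look like (the cookie name matches the one observed; everything else is illustrative):

```python
from http.cookies import SimpleCookie

def hardened_session_cookie(token: str) -> str:
    """Build a Set-Cookie header with the protections described above."""
    cookie = SimpleCookie()
    cookie["FamilyPortalAuthentication"] = token
    morsel = cookie["FamilyPortalAuthentication"]
    morsel["secure"] = True    # never send over an unencrypted link
    morsel["httponly"] = True  # not readable by page scripts (XSS)
    morsel["path"] = "/"
    return morsel.OutputString()
```

The emitted header carries the Secure and HttpOnly attributes; Strict-Transport-Security is a separate response header the server would also send.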

The image below shows that the httpOnly and secure flags are not set on the FamilyPortalAuthentication cookie, which is the one that grants access to a user's authenticated session.

Screen Shot 2015-03-09 at 12.02.53 AM

Updated: Full end-user test plan results

Full results of end-user security testing, repeated in June 2015 to capture numerous recent security improvements, are in the embedded report.  (The original results from March follow below this latest version.)

June 2015 results:

March 2015 results:

O’Reilly Fluent Conference: Web App Testing For Everyone

I'm happy to announce here that I'll be speaking about web app security testing at this year's O'Reilly Fluent Conference (April 20-22, San Francisco).

Web applications are under constant attack, and intrusions and data breaches are on the rise. Though attacks can be complex and sophisticated, many of the most common vulnerabilities are straightforward to observe and exploit.

In this presentation, Tony Porterfield will describe ways for users without extensive security experience to test for common vulnerabilities in web applications using only a browser and free software tools. These techniques will be illustrated with examples of actual vulnerabilities that he has observed while testing educational web applications. He will present a test plan that can be used to survey a site's security in a short amount of time, and describe how it relates to the OWASP ASVS and Top 10 list.