How to send emails

From Wikibase Personal data

Problem Description

The SAR service/feature aims to aid the user in spontaneous initiation of Subject Access Reports towards a data controller. To minimize required effort for this, the feature dynamically creates emails ready to be sent with subject and body texts.

  • The mailto functionality currently used for this, as provided by different browsers, starts misbehaving on Windows 10 / Chrome 73 once the mailto URL character count reaches 2048.
  • This is probably a Chrome/Windows interfacing problem, as Firefox can handle much longer mailto URLs on Windows, and Chrome can as well on iOS, for example.
  • The mailto functionality discards the body text in Microsoft Edge.
  • Using window.location.href = "mailto:" to trigger drafting the email does not affect this behaviour.
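To make the length problem concrete, the drafting step can be sketched as follows. This is a minimal sketch, not the site's actual code; the function names and the 2048 cutoff (taken from the observation above) are assumptions:

```javascript
// Empirical limit observed on Windows 10 / Chrome 73 (see above).
const MAILTO_SAFE_LENGTH = 2048;

// Build a mailto URL with subject and body; all parts must be URL-encoded.
function buildMailtoUrl(to, subject, body) {
  return (
    "mailto:" + encodeURIComponent(to) +
    "?subject=" + encodeURIComponent(subject) +
    "&body=" + encodeURIComponent(body)
  );
}

// Check the URL against the limit before handing it to the browser.
function isMailtoSafe(url) {
  return url.length < MAILTO_SAFE_LENGTH;
}

const url = buildMailtoUrl(
  "dpo@example.com",
  "Subject Access Request",
  "Dear Data Controller, ..."
);
```

When `isMailtoSafe` returns false, the site could fall back to showing the text for manual copy-pasting instead of triggering the mailto link.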

Solution

The problem is solved if the user sets the Windows default app for email to Chrome, and sets Chrome's email handler to Gmail, as described by Google support.

Alternative Approaches

  • Showing a preview of the email to be drafted on the website so that it is easily copy-pasteable by the user in case of misbehaviour
  • Sending emails on behalf of the users from the server side, or via a third-party service like Mailchimp or Postmark, without using the user's credentials. (This raises the question of how a data controller reacts when it gets a Subject Access Request from a third-party address. Could the data controller reply to the SAR originator address instead of the contact info given in the email? Unlikely, as the data controller is responsible for not giving out private information to third parties.) The main problem with this approach is that our server sending the emails would very quickly end up on SMTP blacklists, as this would allow any anonymous third party to trigger invalid or malicious mails being sent from it. (This is probably also true for third-party services, as PD.io users could anonymously trigger such requests, possibly resulting in suspension of service after a series of incidents. This is merely a guess, based on the fact that these services are usually offered to businesses reaching out to their own consenting users, and that process does not normally let third parties use the system to reach fourth parties.) This approach would require filtering for malicious, or more precisely flood-like, behaviour on the PD.io side in any case.
  • Sending emails from client-side JS using SMTPjs, or homebrewing an SMTP client on our server that users can use with their own credentials.

This option would involve handling user credentials, a very bad idea in general: it would mean either our server storing user credentials, even temporarily, as a liability, or handing the user credentials to a third-party server, an uncontrollable risk.

  • Using email provider APIs like the Gmail API. This is a viable alternative, as it uses OAuth, which solves the user credentials problem, but with the disadvantage of tying the user to a specific email provider.
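The copy-paste fallback mentioned above can be sketched with a small helper that renders the drafted email as plain text. This is a hypothetical sketch; `renderEmailPreview` is an assumed name, not existing site code:

```javascript
// Render the drafted email as plain text so the user can copy-paste it
// manually if the mailto link misbehaves on their browser/OS combination.
function renderEmailPreview(to, subject, body) {
  return ["To: " + to, "Subject: " + subject, "", body].join("\n");
}
```

In the page, this string would go into a read-only textarea, possibly with a copy button backed by `navigator.clipboard.writeText`.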


Locally Stored User Data for Dynamic Email Generation

To make generating the mail bodies easy, the site can store persistent, reusable user data on the client machine using some feature enabling this. Viable alternatives for now seem to be:

LocalStorage

  • simple / easy to use
  • only synchronous, blocking requests (see Data Access Considerations below)
  • Partially defensible
  • There is separation of access between different domains, even for subdomains. This means scripts running on pd.io cannot access wiki.pd.io localStorage (a good point in terms of security; however, IFRAMEs can be used to circumvent this in some setups)
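A minimal sketch of the localStorage option, keeping all reusable SAR fields in one JSON blob. The key name is an assumption, and the storage object is passed in explicitly so the same code works against `window.localStorage` in the browser:

```javascript
const STORAGE_KEY = "sarUserData"; // assumed key name

// Persist the user's reusable fields as a single JSON string.
// Note: setItem/getItem are synchronous and block the main thread.
function saveUserData(storage, data) {
  storage.setItem(STORAGE_KEY, JSON.stringify(data));
}

// Load the fields back, returning an empty object when nothing is stored.
function loadUserData(storage) {
  const raw = storage.getItem(STORAGE_KEY);
  return raw === null ? {} : JSON.parse(raw);
}
```

In the browser this would be called as `saveUserData(window.localStorage, {...})`.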

IndexedDB

  • somewhat complicated / multiple lines of code
  • only asynchronous, non-blocking requests (see Data Access Considerations below)
  • can store arbitrary types of data if ever required (e.g. pictures)
  • Equal to localStorage from a security point of view: it has the same same-origin policy, which does not make the data available across subdomains (not even between the http and https versions, for that matter)
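If IndexedDB is chosen, its callback-based API can be wrapped in promises to tame the "multiple lines of code" problem. A minimal sketch, where the database name "pdio" and store name "userData" are assumptions:

```javascript
// Wrap an IDBRequest in a Promise so the async API reads more naturally.
function requestToPromise(request) {
  return new Promise((resolve, reject) => {
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

// Open (or create) the database; "pdio" and "userData" are assumed names.
function openDatabase() {
  const request = indexedDB.open("pdio", 1);
  request.onupgradeneeded = () => {
    request.result.createObjectStore("userData");
  };
  return requestToPromise(request);
}
```

With this wrapper, reads and writes become `await`-able transactions instead of nested callbacks.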

SOLID

  • Solid implementation
  • Good source of future word puns to keep up morale for the developers
  • Can be done at lunchbreak, increasing efficiency
  • Security mostly depends on our specific usage implementation, as it builds on top of existing technologies
  • Only con: it is very new, and we are all rookies.

Data Access Considerations

  • Except for SOLID, there seems to be no easy way to transfer locally saved content from one browser or computer to another. This means it would be useful to give the user the option to convert anonymous client data into server-backed registration data, or rather, just use SOLID.
  • As our predicted number of data fields will probably stay in the double digits, asynchronous or synchronous requests should not make a big difference in performance. In general, though, it is good practice not to access local data until it is actually to be read or written. It is also good practice to read local data into RAM (some JS object) once it is needed, use it from there, and only touch the local copy of the data when writing to it.
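The read-once-into-RAM practice described above can be sketched as a small write-through cache. This is a hypothetical helper (name and shape are assumptions), parameterized over the storage object so it works with `window.localStorage`:

```javascript
// Read stored data once into a plain JS object, serve all reads from that
// copy, and touch the underlying storage only when a field is written.
function createUserDataCache(storage, key) {
  const raw = storage.getItem(key);
  const cache = raw === null ? {} : JSON.parse(raw);
  return {
    get(field) {
      return cache[field]; // served from RAM, no storage access
    },
    set(field, value) {
      cache[field] = value;
      storage.setItem(key, JSON.stringify(cache)); // write-through
    },
  };
}
```

Usage: `const userData = createUserDataCache(window.localStorage, "sarUserData"); userData.set("name", "...");`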