Chemical Synthesis in the 21st Century: Dial-a-Molecule Annual Meeting 2015 Report

“Chemical Synthesis in the 21st Century” was the third running of the Dial-a-Molecule Annual Meeting series.  Returning to the University of Warwick, the meeting was held on 30 June and 1 July 2015 and was attended by 66 delegates over the two days.  The meeting included keynote lectures, updates from Dial-a-Molecule-supported projects, posters, an exhibition showcasing technology developed within the Network, and discussion sessions designed to implement and advance objectives set out in the Roadmap.

Professor David Procter (University of Manchester) delivered the first of the two keynote lectures, describing his group’s research towards new methods for target synthesis using copper catalysts and metal-free cross-coupling reactions.  Dr Rebecca Goss (University of St Andrews) followed in the afternoon with a very informative account of how synthetic chemistry can be combined with synthetic biology techniques to ‘Dial-a-Molecule’.

After Gill Smith (Project Coordinator) gave an overview of the three Dial-a-Molecule inspired Manufacturing the Future projects, Prof Asterios Gavriilidis (UCL) and Prof Kevin Booker-Milburn (University of Bristol) provided updates on the projects they respectively lead: “Sustainable manufacturing in multiphase continuous reactors: Aerobic Oxidations” and “Factory in a Fume Cupboard: Reagentless flow reactors as enabling techniques for manufacture”.

Since the 2014 edition of the Annual Meeting, Dial-a-Molecule has supported a number of small projects centered on advancing areas within the Roadmap.  Dr Thomas Chamberlain (University of Leeds) gave an update on a project developing carbon nanoreactor-stabilized nanoparticle catalysts, and Dr Bao Nguyen (University of Leeds) described the progress his team have made on electrochemically switchable catalysts in flow.  Dr Natalie Fey (University of Bristol) provided a progress report on the collaborative projects she is involved with, looking at the use of descriptor-led ligand screening in organometallic catalysis. Dr Richard Bourne (University of Leeds) rounded off the mini-project updates with his work on optimizing reactions using statistical designs – in both research and undergraduate laboratories.  Prof Steve Marsden (University of Leeds) closed the meeting with an introduction to the European Lead Factory, and explained the opportunities available for academics to become involved.

A highlight of the meeting for many was the exhibition, which showcased technology developed in research labs across the Network.  Each research consortium had four minutes to introduce its technology, followed by a hands-on exhibition.  There was a wide variety of items on show, including electrochemical flow cells and reactors, calorimeters, photochemical reactors, 3D printed flow reactors, mechanical grinding jars, wireless sensors and PCA maps developed for solvent selection.

A unique aspect of Dial-a-Molecule meetings is the breakout discussion sessions that centre on specific objectives of the Roadmap.  Seven of these sessions were held over the two days:

Big Data and Predicting Reaction Outcomes (Chair Prof Richard Whitby)

Following talks from Richard Whitby (University of Southampton), Simon Coles (University of Southampton) and Jonathan Goodman (University of Cambridge), discussions centered on two topics: how to get reaction data, and how to use it.

The breakout concluded that, to collect reaction data, the process has to be easier; however, different sorts of science have different needs, so a ‘one-size-fits-all’ solution may be difficult to find. Importantly, people have to be motivated to record the data in the first place, which may be difficult if there are no immediate uses for the data. Suggestions to overcome these problems included making the most of, and increasing the use of, technology in labs: for example, using ELNs to export data directly into a central curated system, or recording at the point of collection – be this inline data-collection techniques or voice-recording devices in fume cupboards.

The way in which organic chemists are trained to collect data also needs to be revisited, and lessons can be learnt from chemical engineers.  For example, the chemist rarely records the size of the vessel the reaction takes place in, but this is standard in engineering disciplines.  Solvent volumes are rarely recorded with true accuracy either.
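
By way of illustration, the minimal sketch below (in Python, with purely hypothetical field names) shows the kind of structured reaction record such training would encourage, capturing vessel volume and accurately measured solvent volumes alongside the usual reagent details:

    # Minimal sketch of a structured reaction record (hypothetical field names).
    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Reagent:
        name: str
        amount_mmol: float

    @dataclass
    class Solvent:
        name: str
        volume_ml: float  # recorded accurately, not "approx. 20 mL"

    @dataclass
    class ReactionRecord:
        vessel_volume_ml: float  # rarely noted by chemists, standard for engineers
        temperature_c: float
        time_h: float
        reagents: List[Reagent] = field(default_factory=list)
        solvents: List[Solvent] = field(default_factory=list)
        yield_percent: Optional[float] = None

    record = ReactionRecord(
        vessel_volume_ml=100.0,
        temperature_c=25.0,
        time_h=16.0,
        reagents=[Reagent("aryl bromide", 5.0), Reagent("boronic acid", 6.0)],
        solvents=[Solvent("THF", 20.0), Solvent("water", 5.0)],
        yield_percent=82.0,
    )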

To overcome the issues surrounding reaction data, there need to be agreed standards, and this needs to be an international effort.

Towards a National Catalyst Collection (Chair Prof. Steve Marsden)

Steve Marsden (University of Leeds) presented his thoughts on a National Catalyst Collection, including proposals on what it would contain, how it would be curated, how it could be accessed, and the rewards that could be expected.

Feedback from the academic and industrial audience was very promising, and a number of suggestions were made as to how this could be successfully implemented.

New Reactions with Impact (Chair Dr John Clough)

John Clough (Syngenta) presented an introduction to the area from an industrial perspective and suggested the following points for discussion:

  • How can we quickly invent new “reactions with impact”?
  • How can we recognize reactions with impact at their first (often modest) publication?
  • How can we identify “footnote reactions” in the literature, and then investigate those that have the potential to become reactions with impact?
  • If you could invent one new reaction with impact, what would that be?
  • If a reaction were to be named after you, what would you like it to do?

It was suggested that new reactions are typically discovered by one of three routes.  The first, pure design, is relatively rare, as rationally designing a reaction from scratch takes a tremendous amount of knowledge and skill.  The most common way new reactions are “invented” is by serendipity.  Lastly, there is forced/evolved serendipity, pioneered by David MacMillan (Princeton), e.g. high-throughput reaction development.  “Optimizing serendipity” was seen as a way of inventing new reactions; mechanisms to do this included a “Journal of Odd Results”, or a database of odd results.

To recognize reactions with impact, the selection criteria need to be filtered down to the postgraduates and undergraduates who are doing the hands-on work.  In this way, they will be able to recognize an important ‘odd’ result and investigate it further.  Guidelines for the criteria could come from, for example, the “What Industry Want” paper or an online database/virtual ‘wish list’ of reactions.

C-H activation was recognized as a particularly transformative reaction, especially C-H/C-H coupling (with loss of H2) or C-H/C-OH coupling (with loss of H2O).  Others included safe and simple oxidations using ozone or singlet oxygen, heteroatom functionalization (e.g. hydroxyamination), utilizing cellulose/lignin as a source of fine chemicals in the post-petroleum era, transforming flat molecules into 3D structures (increasing the proportion of sp3- relative to sp2-hybridized carbons), and cutting down on the use of protecting-group chemistry.

Lab of the Future (Chairs Gill Smith & Harris Makatsoris)

Following on from presentations delivered by Duncan Browne (Evolution of automated liquid/liquid extraction – simplicity as a driver) and Ian Fairlamb (Optimizing metal-catalyzed cross-coupling processes with a Chemspeed Platform), discussions centered on what the ‘ideal’ future landscape would look like (locally, regionally and nationally; equipment and skills), how these needs can be prioritized, and what routes can be used to take the output forward.

In general:

  • Need to be clear what you want to achieve before you can decide ‘must haves/nice to haves’
    • Key candidate would be screening – of catalysts, substrate scope etc.; possibly screening of work-up and isolation conditions
    • Every lab or department needs its own business case based on what it wants to achieve, which will be different to the Mid-range facility
  • Business case must give some indication of Return on Investment
    • Can we quantify savings in research time per £1 spent? What is the value of getting more done with the saved time?
    • What is the value of the improved/increased data?
  • Data collection is application specific, so templates for each application area should be available
  • Automate the mundane
  • Must have expert staff (not students or temps) dedicated to looking after the kit in a high tech lab
    • Need to have a critical mass, so as not to be vulnerable when staff move on
    • Multi-disciplinary expertise needed

Every lab should have:

  • Easy ‘out-of-the-box’ kit which builds on what’s already available with manual, parallel capability (e.g. Carousel reactors)
  • Reaction screening equipment, to include temperature monitoring, mixing, conversion vs. time
  • Better data collection e.g. wireless sensors
  • Tablet with ELN for every researcher
  • In 10 years’ time would like to have the ‘Synthesis Replicator’ as a universal synthesiser
  • In 20 years’ time would like to have a ‘Tricorder’, to scan a sample and get all the related information on it
  • Could start by buying on eBay, Dove Auction etc. BUT this does not move us into the future, it just brings us up to the recent past
  • Consider collaborative purchasing across Universities to leverage spending power – not easy to do, but that doesn’t mean we shouldn’t try it!
  • Equipment sharing across Universities.  This needs a good, well-curated database.  Some current examples exist, e.g. GW4, but they are not well populated and therefore of little use

Mid-range Facility/Central Service:

  • Needs to be well stocked with standard sets/libraries of reagents, catalysts etc.
  • Manned by expert staff who understand the equipment and the ways of working to get the most out of it
    • Spread best practice
    • 2-way – people can come to them or they can go out
    • Engage with academics and discuss their needs before bringing the work in
  • Modular ‘plug and play’ equipment
    • Could bring in a reactor from outside and use it with the analytical/ data collection kit on-site
  • Automated equipment: batch, parallel reactors, solids handling, liquid handling, flow reactors (some debate, as it is common practice to develop a reaction in batch, then convert to flow – maybe the conversion to flow is done back at base?).
  • Ability to handle multi-phase reactions
  • Multiple analytical techniques and real time analysis (all analysis should be real-time, with minimal sample preparation if possible to avoid sample stability issues)
  • Data handling expertise
  • Must provide training/education to the academics who will make use of the facility

ELNs: Adoption and Data Standards (Chairs Richard Whitby and Aileen Day)

Kiera McNeice (RSC) presented “Why aren’t we using ELNs yet?”, which reported on the recent RSC project that identified ELN user requirements for synthetic organic academic chemists and evaluated some of the ELNs currently on the market.  The project concluded that the RSC would not be developing its own ELN, since commercial ELNs now meet those needs to a high standard, and instead asked: why aren’t academics using them yet, and what can we do about that?  Demonstrations from ELN vendors MestreLabs and Dotmatics followed.  The discussion suggested the following measures:

1) Don’t call them ELNs – or at least call them Electronic Lab Notebooks, as ‘proper’ chemists might not know what an ELN is – and bear in mind ongoing conversations about alternative names.

2) Address the barrier of the time required to evaluate ELNs. Some people have told Dial-a-Molecule “If you tell us which to use, we will use it”. To pick which ELN (or more than one), the following was suggested:
Gather user requirements for academia into one document (to be published?), perhaps taking into account feedback from “champions” who can test ELNs and later spread usage in their departments; pass these on to the ELN companies to see whether they currently meet those requirements, or could do in the future, and subject them to a tender-like process. Then increase community awareness of which ELN(s) meet those needs and of their features.

3) Additional functionality/features of ELNs that might help uptake e.g.:

  • The ability to record different TLCs over time (e.g. paste them into the relevant place within a procedure)
  • Make them able to deal with MOF repeat units and more complex materials (e.g. mixtures)
  • Store crystal structures in them and give the ability to search on their parameters
  • After you’ve done a reaction and obtained a yield, allow back-calculation of the required amounts of starting materials to reach e.g. a target amount of product (a sketch of this calculation follows this list of measures)
  • Allow e.g. kinetics diagrams to be loaded into (or created in?) the ELN
  • Identify other functionality by paper prototyping – trying to reproduce paper lab notebooks

4) Educate academics about the advantages that ELNs can bring:

  •  If supervisors are geographically separated from the lab (e.g. at conferences) and need to sign off experiments
  • Ability to deal with chemical inventories e.g. to subtract the amount left from total amount if you use some in an experiment, or to identify who in your department has a particular compound

5) Look at pricing models, e.g. make it easy for individual groups to try an ELN, drop costs for bigger groups or for more groups in a department, and allow easy progression to discounted institutional licenses

6) Let the next generation (more technology-oriented) have more of a say – PhDs etc.

7) Allow all data to be easily exported in a useful format (e.g. directory-organised pdfs) at any point
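
As a concrete illustration of the back-calculation suggested in point 3, here is a minimal sketch of the arithmetic an ELN could automate; the function and parameter names are hypothetical and not taken from any existing ELN:

    # Sketch of the back-calculation an ELN could automate (hypothetical names):
    # given the yield observed in a previous run, how much starting material is
    # needed to deliver a target amount of product?
    def required_starting_material_g(target_product_g, product_mw, sm_mw,
                                     observed_yield, sm_equivalents=1.0):
        """Mass of starting material needed to reach a target product mass,
        given a previously observed fractional yield."""
        target_product_mol = target_product_g / product_mw
        sm_mol = target_product_mol / observed_yield * sm_equivalents
        return sm_mol * sm_mw

    # e.g. 5 g of product (MW 250 g/mol) at a previously observed 62% yield,
    # from a starting material of MW 180 g/mol:
    print(round(required_starting_material_g(5.0, 250.0, 180.0, 0.62), 2))  # ~5.81 g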

Aileen Day (RSC) presented “S88 – a way to capture reaction procedures in a standardised form” and reported on the progress of a group of instrument vendors, pharmaceutical companies and other interested parties in defining implementation guidance (XML schema, examples, documentation, common terminology) to capture reaction processes in a repeatable way. The following points were made when discussing the applicability of the approach so far to academic bench chemists (in contrast to the input to date from downstream, plant-level production chemists):

1) Academics in the group looked doubtful about filling in a stepwise procedure in the kind of detail suggested, but acknowledged that current levels of process capture aren’t reproducible. They would be more likely to do this if:

  • There were tools to guide them through putting together a reaction process plan and specifying what parameters and additional inputs to fill in when executing each step (e.g. timestamp, observation, uploaded file such as a TLC)
  • There were dropdown lists to choose from for actions, parameters etc., as suggested, to make it easier to fill in a process
  • They could clone previous reaction processes (with links back to the cloned source reaction processes)

2) There was minimal time to discuss the suggested lists of actions, parameters and operations, but one comment was to break “agitate” down into different values (“stir”, “homogenise”, “shake”, “vortex (swirling)”), to break “stirring” into “magnetic” and “mechanical”, and to assign parameters such as “rpm” to the “stirring” action (a minimal sketch of this kind of structured capture follows).
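
To make this kind of structured capture more concrete, here is a minimal, illustrative sketch of a two-step procedure recorded against a controlled action vocabulary; the element and attribute names are invented for illustration and are not taken from the group’s actual S88 implementation guidance:

    # Illustrative only: a toy, S88-inspired structured procedure; the element and
    # attribute names are invented and not taken from the group's actual schema.
    import xml.etree.ElementTree as ET

    # Controlled vocabulary along the lines suggested in the discussion.
    AGITATE_ACTIONS = {"stir", "homogenise", "shake", "vortex"}
    STIR_MODES = {"magnetic", "mechanical"}

    procedure = ET.Element("procedure", name="example reaction")

    step1 = ET.SubElement(procedure, "step", index="1", action="charge")
    ET.SubElement(step1, "material", name="substrate", amount="5.0", unit="mmol")

    step2 = ET.SubElement(procedure, "step", index="2", action="stir", mode="magnetic")
    ET.SubElement(step2, "parameter", name="rpm", value="600")
    ET.SubElement(step2, "parameter", name="duration", value="2", unit="h")
    observation = ET.SubElement(step2, "observation")
    observation.text = "solution turned pale yellow"

    # Validate the recorded action against the controlled vocabulary.
    assert step2.get("action") in AGITATE_ACTIONS and step2.get("mode") in STIR_MODES

    print(ET.tostring(procedure, encoding="unicode"))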


Enabling New Chemistry with Flow

Following on from a presentation by Rob Green (University of Southampton), “Electrosynthesis in flow – an overview”, discussions centered around the following questions:

1) What can and can’t we do using flow chemistry?

  • Any reaction that uses solids or forms precipitates is difficult to perform, although not impossible.
  • Temperature gradient along the flow path can be difficult to control at high resolution.
  • Solid packed-bed reactors can be problematic on scale, particularly regarding uniformity of the catalyst bed, hot-spots and possible side-products.
  • Catalyst deactivation and leaching are still problematic with complex catalysts, and better immobilization/reactor technologies need to be developed.

2) What reactions do we want to do?

  • Cascade, multi-step reactions.
  • Reactions involving short-lived, hazardous chemicals for safety reasons.
  • Biovalorisation of biomass materials.
  • Syntheses of organometallic catalysts which are traditionally highly air and moisture sensitive.

3) What are the barriers for the wider application of flow chemistry?

  • Achieving efficient mixing and control of reaction conditions is not easy for non-specialists.
  • A general understanding of flow chemistry is not widespread, and many members of the synthetic community still consider flow chemistry a niche area of research.
  • Prohibitive start-up cost for newcomers.
  • ‘Home-made’ kits are much more capable and flexible than commercial ones. However, they require technological know-how regarding pumps, control systems, integration of kit components, and software, which is not currently available to typical chemists.
  • Recommended actions:
    • Support groups.
    • Open access instructions, software and designs (RSC?).
    • Higher standards in the description of equipment in ESIs.
    • Incorporation of flow chemistry into undergraduate chemistry training.

4) When do we move batch reactions into flow?

  • Should be driven by economic benefits.
  • Safety issues.
  • High throughput and small footprint constraints.
  • When one has enough online and inline analytical tools.

5) What is currently difficult in continuous downstream processing and what do we want to see developed?

  • Chromatography is still difficult (although in principle we should not need it).
  • Crystallization of products from the reaction mixture; this can, however, be done in CSTR mode.
  • Cheaper quenching methods (e.g. acid/base resins) are still desirable.

6) What are the barriers for integration of continuous processing technologies?

  • None of them are commercialized yet.
  • Needs a universal platform and interface to allow processes to be moved between different pieces of equipment on different scales.
  • Chemists need to be able to interface different instruments with computers. Generally, the required skill-set is much better addressed through interdisciplinary collaborations.


Statistical Methods in Chemistry Research (Chair Gill Smith)

Dial-a-Molecule has been especially active in promoting the use of statistical methods (e.g. DoE, PCA) in mainstream academic chemistry courses and research.  As part of this initiative, we have supported the development of UG laboratory modules.  This session gave an overview of the lab module (Richard Bourne), including student feedback, and discussions centered on how the module can be adapted and transferred to other Universities across the UK.
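
For readers unfamiliar with the approach, the sketch below illustrates the kind of design the module introduces: a two-level full factorial over three reaction variables, with main effects estimated by least squares.  The factors, levels and yields are made up for illustration and are not the module’s actual experiment:

    # Two-level full factorial design (2^3 = 8 runs) of the kind introduced in the
    # DoE lab module; the factors, levels and yields below are purely illustrative.
    import itertools
    import numpy as np

    factors = {
        "temperature_C": (40, 80),
        "catalyst_mol_percent": (1, 5),
        "time_h": (2, 16),
    }

    # Coded design matrix: every combination of low (-1) and high (+1) levels.
    coded = np.array(list(itertools.product((-1, 1), repeat=len(factors))))

    # Responses (yield %) for the 8 runs; in the module these come from experiments.
    yields = np.array([32, 45, 38, 60, 35, 50, 44, 71], dtype=float)

    # Fit intercept + main effects by least squares: y = b0 + sum(bi * xi).
    X = np.column_stack([np.ones(len(coded)), coded])
    coeffs, *_ = np.linalg.lstsq(X, yields, rcond=None)

    for name, coeff in zip(factors, coeffs[1:]):
        # Moving a factor from its low to its high level changes yield by ~2 * coeff.
        print(f"{name}: estimated main effect {2 * coeff:+.1f} % yield")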

Discussion and feedback on the Lab Module:

  • Introduction to the theory, based on industry experience and simulations (provided by Martin Owen (GSK) or Brian Taylor (AZ) in the two runs so far)
  • How do we provide this to others?
    • Could video Brian or Martin
    • Try to call on the goodwill of our industry contacts
    • Try to enable those running the module to deliver it themselves
    • Involve statistics departments at the Universities
  • Theory and practical need to be close in time – with CDT students, the whole module was completed in a day (9:30-6:30)
  • May need to include some basic stats
  • Need to allow flexibility in how to run it
  • Undergrads split the tasks equally with each doing a bit of everything
  • CDT students chose to take roles
  • UGs ran the module over several lab sessions, cf. CDT students in one day
  • To reduce errors, need to consider making up solutions in advance, pre-heating water baths etc.
    • UGs had a lot of spread in their results – but at the end they managed to identify their sources of error
    • Could expand to include gathering/calculating physical organic data for the reactions
    • Infrastructure needed to support running the module
  • Would be beneficial to provide the basic kit – syringe pump(s), tubing and sample port (could this be provided by RSC cf. Spectroscopy in a Suitcase?)
  • Expert support
  • ‘Wiki’ – internal, or owned by RSC
  • Paper in preparation for J. Chem. Ed.

Pilot workshop to enable others to run the Lab Module:

  •  2 days – for future workshops, consider running the first day for those who know nothing at all, and bringing in others on the second day who already know the basics but want to know about the lab module
  • Aim for 8 attendees from 4 institutions (one lead academic, one post-doc or teaching fellow from ‘keen’ universities who will lead by example)
  • Include a run-through of the lab module
  • Introduction to the theory and practical
  • Before the run-through, start with more on theory (those deploying/running the module need more knowledge than those taking it)
  • Content should be software neutral
  • Could consider running a workshop for a Teaching Fellows’ conference