So far at least, if the polling above has this right, there is no real appetite among the UK public for easing the lockdown restrictions. For every easing option the poll suggested, a majority of respondents wanted things to stay as they are.
Comments
Oh, is that a first??
https://epcced.github.io/ramp/
Much more practical is a return to work in construction, offices, courts (please) and other environments where social distancing can largely, if not completely, be maintained.
Too complicated as it's quite likely to be a phased/limited/geographical/age-related relaxation of the shutdown.
I'm about to leave home (a 1,000 sq ft flat) for a walk for the first time in 25 days.
Shops are opening, up to 30% capacity, queuing system in car parks, masks compulsory in public spaces and thermometers at entrances.
Small family gatherings allowed at home (no more than 5 people).
Some sectors allowed back to work (including mine), offices to be no more than 30% occupied.
All entertainment venues remain closed, bars and restaurants open with restricted capacity, buffet food still banned.
Curfew from 10pm-6am
Previous restrictions (which lasted three weeks) were much tighter than the UK's - needed permission from the police to go out, and all shops were closed except food shops and pharmacies.
https://www.thenational.ae/uae/government/coronavirus-dubai-lifts-permit-restrictions-to-allow-shopping-and-exercise-1.1010340
Oh, and "Ramadan Kareem" to all PBers of the Islamic faith; today marks the start of the Holy Month in the Muslim calendar.
Which shows that the economic impact of this is currently very uneven and it is very likely to become even more so going forward.
The more I read about the modelling, the more shocked I am at just how out of date most of the techniques (and code) they're using are. They seem totally unaware of lots of modern ML techniques, such as Gaussian Processes (and no, that isn't just fitting a Gaussian to some data).
Seems like a real lack of multi-disciplinary cross pollination of ideas has gone on.
The UW model is an absolute shit show on so many levels...and that is apparently one the US government thinks is good.
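For anyone wondering what I mean by a Gaussian Process, here's a minimal sketch using scikit-learn's GaussianProcessRegressor: fit a noisy daily-case-style series and get a forecast with uncertainty bands. The data and kernel choices below are made up purely for illustration, not anything the modelling groups actually use.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Made-up series standing in for daily case counts - purely illustrative.
days = np.arange(30).reshape(-1, 1)
cases = 50 * np.exp(0.1 * days.ravel()) + np.random.normal(0, 20, 30)

# RBF kernel captures the smooth trend; WhiteKernel absorbs observation noise.
kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=100.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(days, cases)

# Forecast a week ahead, with uncertainty - the point of a GP is that you get
# calibrated error bars out, not just a point estimate.
future = np.arange(30, 37).reshape(-1, 1)
mean, std = gp.predict(future, return_std=True)
for d, m, s in zip(future.ravel(), mean, std):
    print(f"day {d}: {m:.0f} ± {2 * s:.0f}")
```

That's the appeal: the uncertainty comes out of the model for free, rather than being bolted on afterwards.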
https://www.theguardian.com/world/2020/apr/24/vodafone-exec-5g-coronavirus-conspiracy-theory-video-revealed-pastor-luton-jonathon-james
The moment schools reopen as normal I will start taking them and not a day later. But it would be irresponsible to take them today when I have an alternative solution.
"Johnson’s tragedy is that he has no safe option
Every political decision means weighing costs and benefits but rarely are the choices as grim as those the PM faces now" (£)
https://www.thetimes.co.uk/article/johnsons-tragedy-is-that-he-has-no-safe-option-3qgq9kgfn
The process by which it is relaxed, however, is a much more political one, with ministers needing to weigh up the various scientific, economic and health factors (including health harms caused by the lockdown itself) to come to a conclusion.
"He claims to have at one point advised central bankers in the Congo and Bangladesh on cryptocurrencies and he has posed with the South African president, Cyril Ramaphosa. In 2018 he was working as an economic adviser for a Zimbabwean opposition party, urging it to save the economy using Bitcoin-type products pegged to diamond deposits through blockchain technology."
https://twitter.com/DHSCgovuk/status/1253688601637986304?s=20
Yes, the first thing that came to mind was to create an ML model with all of those inputs I mentioned, train it on the existing data we have from the end of March to the middle of April, and then see how well it predicts the outcomes that follow. I'm really shocked that no one is doing this; the modelling teams seem stuck in the dark ages.
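That sort of backtest is straightforward to set up: fit on the earlier window, hold out the later one as a pseudo-forecast, and score it. A toy sketch, with random data and a stock scikit-learn regressor standing in for the real inputs and model:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

# Toy daily data - in reality these would be the inputs mentioned above
# (mobility, density, demographics, etc.), not random numbers.
rng = np.random.default_rng(0)
dates = pd.date_range("2020-03-25", "2020-04-20", freq="D")
df = pd.DataFrame({
    "date": dates,
    "mobility": rng.uniform(0.2, 0.5, len(dates)),
    "density": rng.uniform(1000, 5000, len(dates)),
    "deaths": rng.poisson(100, len(dates)),
})

# Train on the earlier window, hold out the final week as the "forecast" period.
train = df[df["date"] < "2020-04-14"]
test = df[df["date"] >= "2020-04-14"]
features = ["mobility", "density"]

model = GradientBoostingRegressor().fit(train[features], train["deaths"])
pred = model.predict(test[features])
print("MAE on held-out week:", mean_absolute_error(test["deaths"], pred))
```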
Most private schools (outside the top 'public' schools) operate as charities and don't have a lot of profit margin, savings or endowments.
- Stockholm
- Rest of Sweden
... which matches well with the suggestion that the local population density really has to be taken into account.
As of today, Stockholm passed the figure of 500 deaths per million (1,192 deaths in a population of 2.377 million).
The rest of Sweden has a figure of 122 deaths per million (960 deaths out of 7.85 million).
Quite a difference.
Edit: totally bollocksed up the blockquotes there. Not sure who that quote belongs to.
And due to their widespread use, highly optimized libraries are available for most of them - if nothing else, as a starting point to build out from.
Even if his method is sound, I bet it runs orders of magnitude slower than it should, because it won't have been developed with all the available advances.
Not writing code that makes use of GPU number crunching these days is just dereliction of duty.
(I may be being unfair, but it's a theory.)
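For what it's worth, the barrier to entry isn't high. A minimal sketch with CuPy (assuming an NVIDIA card and the cupy package - which I have no idea whether the modelling groups have): the NumPy-style code stays the same, it just runs on the GPU.

```python
import numpy as np
import cupy as cp  # drop-in NumPy-style library that runs on the GPU

# A big matrix multiply standing in for whatever the heavy number-crunching
# step of the model actually is.
a = np.random.rand(4000, 4000).astype(np.float32)
b = np.random.rand(4000, 4000).astype(np.float32)

# Move to GPU memory, compute there, copy the result back.
a_gpu, b_gpu = cp.asarray(a), cp.asarray(b)
c_gpu = a_gpu @ b_gpu
c = cp.asnumpy(c_gpu)

print(c.shape)  # (4000, 4000), computed on the GPU
```

For anything dominated by big linear-algebra or Monte Carlo steps, that alone can account for the orders-of-magnitude difference.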
S. Korea managed to control their outbreak with around 20k tests a day, which suggests it's not all about the raw numbers.
Surely it's more likely that the falling off is down to the government's and people's reaction to it?
If done competently, then the political element ought to be moot.
In the eyes of the media and opposition, meeting (or missing) that target will define the whole government response to the pandemic.
Comes back to what I said earlier about explaining in a grown-up manner what you're trying to do: what the constraints are, what the challenges are, and what the possibilities are. They really haven't bothered, which gives a degree of concern that they don't really have it worked out.
I am fairly certain it was just driven by the reports at the time that Germany could do 100k a day (although nobody fully knows these figures, as they aren't centrally reported). And of course, they can now probably do 200k a day, so the criticism will still be: look, they can do more.
Would have been better to lay out a number that allowed certain things to happen, i.e. everybody in hospital getting tested, NHS staff getting tested, etc.
Also...big thing...it isn't about the number of tests, it's the speed of processing. You have to get these conducted and the results reported within 24hrs with this CV bastard; 2-3 days is too long.
The way I'd do it is to have the military take over 1,000 car parks, doing 100 tests each over the course of each day - that's the 100,000 a day right there. Probably only needs a couple of people trained at each site, plus the logistics of getting the tests to labs. Perhaps the National Blood Service could also help; they're pretty good at medical logistics too, and have vehicles with blue lights.
When I suggested I pop over to the Maths department and see if I could find somebody who might be able to explain it, I got looked at as if I had just suggested we hire a leper to make the teas.
I bet if you could split by smaller sub-region then it would just be central Stockholm that was red hot.
We spent all this time at the beginning telling ourselves we were lucky to have two weeks more warning than Italy to react to the situation (and more time than that compared to South Korea), but we didn't manage to use that time to reduce the peak rate of infection.
As a first step, PHE will be linking thousands of existing health records for confirmed COVID-19 cases to gather more robust data, and I am delighted that Trevor Phillips OBE and Professor Richard Webber have agreed to provide expert independent support.
https://publichealthmatters.blog.gov.uk/2020/04/24/duncan-selbies-friday-message-24-april-2020/
- London: 491 deaths/million
- Rest of UK: 261 deaths/million
There's still a difference, but it's less stark (London has a death rate 88% higher than the rest of the UK; Stockholm County has a death rate 311% higher than the rest of Sweden).
Oh. Stockholm County has a worse death rate per capita than the worst region in the UK.
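The arithmetic behind those comparisons, using only the figures quoted in the posts above, for anyone who wants to check it:

```python
# Deaths and populations quoted above.
stockholm   = 1192 / 2.377e6 * 1e6   # ~501 deaths per million
rest_sweden = 960 / 7.85e6 * 1e6     # ~122 deaths per million
london, rest_uk = 491, 261           # deaths per million, as quoted

print(f"Stockholm County: {stockholm:.0f}/million, rest of Sweden: {rest_sweden:.0f}/million")
print(f"Stockholm excess over rest of Sweden: {stockholm / rest_sweden - 1:.0%}")
print(f"London excess over rest of UK: {london / rest_uk - 1:.0%}")
```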
For example, running a GLM on a motor insurance book with 10 years of 500k policies (ie 5m vehicle years) used to occupy the pricing team for weeks, and meant they could update the results maybe a few times a year. Now, they can virtually rerun it in real time every time a new claim comes in, which frees them up to do more interesting stuff.
Similarly, around a decade ago, we still had to roll-up the natural catastrophe exposed portfolio (say a few million properties) every month - because it had to run over the weekend. A few years back, we switched to real time analysis that could be refreshed during renewal season for each individual contract.
My point is that I don't see what the analogue is in terms of datasets, regarding pandemic modelling. If it was worth assessing the susceptibility of the entire UK population, on a line-by-line basis, then sure, new methods (and hardware) would be absolutely key. But that doesn't seem to follow logically. It feels more like the issues are good old-fashioned ones like whether the models are robust enough, and how well they've been sensitivity tested, or the extent to which they rely on incomplete or potentially inaccurate data. And, while there have certainly been developments in those areas in recent years, I'm not sure there's been quite the quantum shift that would render the older ones totally redundant (even assuming the government advisers are materially behind the curve).
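To make the GLM example above concrete, here's a minimal claim-frequency fit with statsmodels. The column names and data are invented purely for illustration; a real motor book would have far more rating factors and millions of rows rather than ten.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Toy policy-level data standing in for the motor book described above.
df = pd.DataFrame({
    "claims":        [0, 1, 0, 2, 0, 1, 0, 0, 1, 0],
    "vehicle_years": [1.0, 0.5, 1.0, 1.0, 0.8, 1.0, 1.0, 0.5, 1.0, 1.0],
    "driver_age":    [22, 45, 38, 19, 60, 33, 51, 27, 41, 36],
    "vehicle_group": ["a", "b", "a", "c", "b", "a", "b", "c", "a", "b"],
})

# Poisson GLM for claim frequency, with vehicle-years as the exposure.
model = smf.glm(
    "claims ~ driver_age + C(vehicle_group)",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["vehicle_years"],
).fit()

print(model.summary())
```

Which rather supports the point: the tooling for this is mature and quick; the interesting questions are about whether the model structure and the data feeding it are any good.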
Ministers say that UK testing capacity is now at 51,000 a day.
https://www.dailymail.co.uk/news/article-8252717/Coronavirus-UK-Key-workers-start-booking-Covid-19-tests-new-system.html
Even the spin number is only half what it needs to be.