PHE using Excel for data tells you all you need to know about their expertise.
More common than you think - the number of systems I've seen where you can upload data by loading spreadsheets.....
That sounds so awful.
The shit I've seen. When this is over, at the next PB beers I'll do a standup routine on "IT systems - the bad, the insane and the stuff with the wrong number of dimensions"
I know way too much about the Java Apache POI library. Awesome though it is - for *generating* spreadsheets.
Python is a great scripting language. But for serious computing...
What do you mean by serious? Launching a nuclear strike or some data wrangling?
For the latter I don't think it really matters that much what language you use. What matters is how well it is written and managed, and how well the writer understands what they are doing.
It does matter. One of the problems with Python is the culture of the "developers"*. The number of times I have had to explain concepts of code structure and testing to Python and Javascript... code writers..... There is a lot of "but it runs, ship it" Python hackers out there.
*Many I wouldn't class as real developers
The language used is irrelevant. It is lack of basic data processing standards and data handling. This sort of "Throw data in and it will work" approach is amateur hour....
Surely that's just an example of the writer not understanding what they are doing?
I agree there is a certain culture around different languages, but it doesn't have to be like that. I'm sure I could write really terrible stuff in any language.
Kotlin is, I think, the only language I've worked with that does actually try to stop you from writing bad code. I can still do it though - it just requires a little more perseverance :-)
The dog's breakfast that is Javascript, on the other hand...
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
I just don't see how that's true, what is it even based on?
A Daily Mail article. Must be true.
I think it's more likely that the columns (even if that were the case) were the result of whatever software might be outputting TO the spreadsheet and/or some issues with file formats (e.g. opening and resaving!).
As you say it is impossible to think a human used 16,000 columns...
Okay but even in that case, to use a spreadsheet as your source of truth compared to the/a db is nuts.
I meant output by the local system for transfer to the national system. Obviously that isn't the way you SHOULD do things, but not impossible to imagine.
Or they could write an API
Right. But when presented with lab software that has "Export to CSV", I am sure that option seems mighty tempting.
Just my hypothesis for a believable course of events here.
But upload to the national system would either be utilising csv.reader or openpyxl, both will read through lines not columns. At least whenever I've used them.
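For what it's worth, the row-wise idiom both libraries push you towards looks roughly like this (a sketch - the file name and data are made up):

```python
import csv

# A tiny stand-in for a lab export - entirely invented data.
with open("results.csv", "w", newline="") as f:
    f.write("specimen_id,date,result\nA1,2020-09-25,positive\nA2,2020-09-26,negative\n")

# Typical row-wise ingestion: csv.reader yields one record per line,
# so there is no 16,384-column ceiling to run into.
with open("results.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)                      # field names come from the first row
    records = [dict(zip(header, row)) for row in reader]

print(len(records))   # one record per row
```

openpyxl's `iter_rows()` works the same way: you walk down the sheet a row at a time.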
Culture in coding is often centred on languages. Changing those cultures is hard. I've had Javascript devs complain to senior managers. Because I was making them write unit tests.
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Nobody enjoys unit tests, now you're just being dishonest
Today is our 35th wedding anniversary. We are out for afternoon tea at the Old Course hotel later. The planned walk is looking a little problematic, however. Some minor roads are closed with flooding around here.
Congrats! - Squeeze in a few holes too?
Lord no, what a waste of time that would be. I find golf just beyond tedious. The only good things about it are the walk and the outdoors.
The sandpit things on my local course are excellent for jumping my CRF250.
You go to extraordinary lengths to achieve a certain level of popularity, don't you? Remarkable.
Leave Dura-ace alone - he`s comedy gold.
When I was 18 me and my mate did donuts on a golf course in his dad's W115 220 automatic. It open diffed and blew up the torque convertor on the second loop. His dad was livid and sent him to Sunderland Polytechnic as punishment.
Wow! Those in-house Merc. autoboxes were bulletproof.
High rpm with relatively light load is an excellent way to destroy an auto, as the output shaft speed can drop to zero in an instant while the input shaft is still raging.
I'll bear that in mind next time I autotest the wife's c200, 9 speed Getrag auto.
Definitely not my experience - I find Python developers tend to be some of the best developers out there, with the exception of some hipster languages like Haskell that hardly anybody builds anything useful in.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
The problem with Excel is that people who don't know what they're doing can use it.
Yup, that's both the strength and the weakness of Excel, it explains why the entire world economy runs on it, and also why the world economy crashes from time to time.
Well, I think we can agree that at least one bad decision was made. We just have to speculate on which one.
So is there an actual official figure for number of cases over the past few days? Or are we just guesstimating?
The figures on the dashboard are up to date, the extra cases have all been added in at this point.
Except that pinned at the top is this caveat: "The cases by publish date for 3 and 4 October include 15,841 additional cases with specimen dates between 25 September and 2 October — they are therefore artificially high for England and the UK."
Therefore can I conclude that no one has an accurate figure for cases over the weekend?
And still 'Green' types will be unhappy, because it won't be socialism.
Whatever you do, don't get a smart meter.
Energy companies want to use them to black you out when the above proposals don't produce nearly enough for our needs.
Is there any conspiracy theory you won't go for?
Much as it pains me to say so, I actually agree. The fine print in the smart meter stuff is so that the energy companies can reduce or shut off your supply if they need shape demand. One of those "We'll never need to do this, but strangely we insist upon it" things.
So the spreadsheet they were using for the results reached its maximum size and simply excluded all the results that followed.
Epic fail.
WTAF? Who works with large datasets and doesn't know that an Excel worksheet has a maximum size? I despair.
Who works with large datasets in Excel?!?
HMG.
It's not a particularly large dataset. Not by modern standards.
Weaning organisations off Excel is hard. The whole planet is infested with the stuff.
There's nothing wrong with Excel, provided you actually need a spreadsheet. The trouble is far too many organisations use Excel as a catch-all tool for scripting, as a database, or as a contacts directory; I've even heard of it being used for writing documents rather than Word.
Maybe I am biased by all the quants and data scientists scribbling in it.
I also don't know how they would write a script to parse such a file.
As a tool that generates presentations, I certainly didn't do that
CCS is a waste of time and money on its own, such as the US likes to do so it can mildly greenwash itself. As part of a reasonably integrated system it has some value as a retrofit to gas plants and the like which might need to run for another decade. It's going to depend how much CCS is in this plan.
Oh on your bike with the culture war today. This is objectively good news. There's always more to be done but one of the few things I think this government has twigged is that money is to be made in decarbonisation.
Wasn´t all that set up by the Lib Dem ministers during the Coalition Government?
I've worked with a few developers who liked test driven development.. But the idea that you could complain when forced to write tests for your code......
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
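It really is only a couple of lines to flip the thing back, which just makes the choice odder. A sketch with invented data, assuming one column per record:

```python
import csv
import io

# Made-up column-oriented layout: first field of each line is the field name,
# and each *column* after that is one record.
raw = "specimen_id,A1,A2\nresult,positive,negative\n"

rows = list(csv.reader(io.StringIO(raw)))
# zip(*rows) transposes columns into rows; the first "row" is then the header.
header, *records = [list(col) for col in zip(*rows)]
print(records)   # [['A1', 'positive'], ['A2', 'negative']]
```

Doable, but you have to hold the whole file in memory to transpose it - another reason nobody sane ships data sideways.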
Matt Parker's book highlights a study of Enron's spreadsheets, which concluded that 42.2% included no formulae at all - that is to say, Excel was being used as a desktop publisher (by output at least). The rest were about as bad, but you should read the book yourself.
I do a few hours of screaming before setting up for a day of unit test dev
I blame the "Agile development" more than a language. Too many took it to mean "no testing needed" and too many managers took it to mean "shorter development time = lower costs". Goodbye testing, let the users do it.
The initial concept of small, easily implementable code changes that required simple, minimal tests was fair enough, but too many saw a bandwagon with a brilliant excuse to do less work.
Interesting that one of the creators of Agile, Dave Thomas, posted a YT video declaring "Agile is dead" and saying that it did not turn out the way he expected.
In some CSV libraries, you can dynamically read the header row - so instead of expecting a fixed set of headings, you can deal with a varying column set.
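In Python that's `csv.DictReader`, which keys each record off whatever header row it finds - a sketch with invented feeds:

```python
import csv
import io

# Two hypothetical feeds with different column sets - DictReader adapts to each.
feed_a = "specimen_id,result\nA1,positive\n"
feed_b = "specimen_id,result,lab\nB1,negative,Leeds\n"

records = []
for raw in (feed_a, feed_b):
    for record in csv.DictReader(io.StringIO(raw)):
        # record is a dict keyed by whatever the header row declared
        records.append(record)

print(records[1]["lab"])   # the extra column just shows up as an extra key
```

Still row-by-row underneath, of course - the varying part is the columns per record, not the orientation.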
Yeah any time I've ever needed to parse something like that it uses headers and then loops the process down rows (think my record is 7.7m rows in a CSV). Columns for case data is just so odd as a concept.
We had a 2 billion+ records in a db table, I think if you tried to export to a CSV Oracle would send somebody round to shout at you
JS is literally designed so that you can write crap and it will still attempt to do something with it.
Not quite. JS is crap because they subsumed so many versions of it into the ECMA standard years ago and the browsers never made a brilliant job of implementing it.
Things have improved somewhat since, but JS is not my favourite language.
I once read in a two-billion line CSV file, admittedly split to avoid file size issues. The computer did not enjoy the experience. That was quickly converted into a binary format.
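Fixed-width struct-packing is the kind of thing that pays off there - a toy sketch with made-up numbers, not the actual pipeline:

```python
import csv
import io
import struct

# Toy numeric data standing in for the real file.
raw = "id,value\n1,3.5\n2,7.25\n"
rows = [(int(r["id"]), float(r["value"])) for r in csv.DictReader(io.StringIO(raw))]

# Fixed-width binary: one little-endian int32 + one float64 per record.
packed = b"".join(struct.pack("<id", i, v) for i, v in rows)

# Reading it back is a fixed-stride scan - no text parsing, and seekable.
size = struct.calcsize("<id")
decoded = [struct.unpack("<id", packed[o:o + size]) for o in range(0, len(packed), size)]
print(decoded)   # [(1, 3.5), (2, 7.25)]
```

At two billion lines the difference between re-parsing text and striding over fixed-width records is the difference between hours and minutes.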
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
I also don't know how they would write a script to parse such a file.
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
In some CSV libraries, you can dynamically read the header row - so instead of expecting a fixed set of headings, you can deal with a varying column set.
Presumably though that's an override or a different function being used, it's not the default behaviour? So as to my point that if you were doing this, you would know
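In Python's standard library it's much the same: `csv.DictReader` takes its field names from the first row by default, so a varying column set is handled without hard-coding headings. A minimal sketch (the data here is invented for illustration):

```python
import csv
import io

# A small CSV with a header row; DictReader picks up the field
# names from that first row automatically, then loops row by row.
data = io.StringIO("name,result\nAlice,negative\nBob,positive\n")

reader = csv.DictReader(data)
rows = list(reader)

print(reader.fieldnames)   # field names discovered from the header
print(rows[0]["result"])
</n>```

So the default behaviour is exactly the header-then-rows pattern described above; reading a column-per-record file would mean fighting the library rather than using it.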
PHE using Excel for data tells you all you need to know about their expertise.
More common than you think - the number of systems I've seen where you can upload data by loading spreadsheets.....
That sounds so awful.
The shit I've seen. When this is over, at the next PB beers I'll do a standup routine on "IT systems - the bad, the insane and the stuff with the wrong number of dimensions"
I know way too much about the Java Apache POI library. Awesome though it is - for *generating* spreadsheets.
Python is a great scripting language. But for serious computing...
What do you mean by serious? Launching a nuclear strike or some data wrangling?
For the latter I don't think it really matters that much what language you use. What matters is how well it is written and managed, and how well the writer understands what they are doing.
It does matter. One of the problems with Python is the culture of the "developers"*. The number of times I have had to explain concepts of code structure and testing to Python and Javascript... code writers..... There is a lot of "but it runs, ship it" Python hackers out there.
*Many I wouldn't class as real developers
Surely that's just an example of the writer not understanding what they are doing?
I agree there is a certain culture around different languages, but it doesn't have to be like that. I'm sure I could write really terrible stuff in any language.
Culture in coding is often centred on languages. Changing those cultures is hard. I've had Javascript devs complain to senior managers. Because I was making them write unit tests.
I blame "Agile development" more than any language. Too many took it to mean "no testing needed" and too many managers took it to mean "shorter development time = lower costs". Goodbye testing, let the users do it.
The initial concept of small, easily implementable code changes that required simple, minimal tests was fair enough, but too many saw a bandwagon with a brilliant excuse to do less work.
Interesting that one of the creators of Agile, Dave Thomas, posted a YT video declaring "Agile is dead" and saying that it did not turn out the way he expected.
I remember at a previous job we used agile for all our development. Including FPGA development (free nerd hat if you know what that stands for). Long and short of it is that FPGAs are a reprogrammable circuit that you can do digital logic on. They're brilliantly fast and flexible but take a bloody long time to code on. Someone thought that we could agile this, which is like saying you can steer an oil tanker with an outboard. I spent every standup saying "I'm doing the same thing as yesterday, writing testbenches and hardware description code. This hasn't changed".
JS is literally designed so that you can write crap and it will still attempt to do something with it.
Not quite. JS is crap because they subsumed so many versions of it into the ECMA standard years ago and the browsers never made a brilliant job of implementing it.
Things have improved somewhat since, but JS is not my favourite language.
JS is literally designed so that you can write crap and it will still attempt to do something with it.
Like silently evaluating 2 + "3" to get "23" and then silently converting that to 23!
Have you seen that presentation, as well?
What presentation? I just made it up on the spot, having been caught out by JS's silent type conversions just recently (while writing a CSGO skin withdraw bot for my lad).
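Python, for contrast, refuses that kind of silent coercion outright: mixed-type addition raises a `TypeError` and you have to convert explicitly. A quick sketch of the difference being described:

```python
# JS silently evaluates 2 + "3" as string concatenation ("23").
# Python raises TypeError on the mixed-type addition instead,
# forcing an explicit conversion.
try:
    result = 2 + "3"
except TypeError:
    result = 2 + int("3")   # explicit conversion: 5

print(result)
```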
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
I also don't know how they would write a script to parse such a file.
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
Yeah any time I've ever needed to parse something like that it uses headers and then loops the process down rows (think my record is 7.7m rows in a CSV). Columns for case data is just so odd as a concept.
We had 2 billion+ records in a db table. I think if you tried to export that to a CSV, Oracle would send somebody round to shout at you
Lol! We had someone try and download a 101m record table with around 30 fields into a CSV from GCP. He got shouted at by the sysadmin.
PHE using Excel for data tells you all you need to know about their expertise.
More common than you think - the number of systems I've seen where you can upload data by loading spreadsheets.....
That sounds so awful.
The shit I've seen. When this is over, at the next PB beers I'll do a standup routine on "IT systems - the bad, the insane and the stuff with the wrong number of dimensions"
I know way too much about the Java Apache POI library. Awesome though it is - for *generating* spreadsheets.
Python is a great scripting language. But for serious computing...
What do you mean by serious? Launching a nuclear strike or some data wrangling?
For the latter I don't think it really matters that much what language you use. What matters is how well it is written and managed, and how well the writer understands what they are doing.
It does matter. One of the problems with Python is the culture of the "developers"*. The number of times I have had to explain concepts of code structure and testing to Python and Javascript... code writers..... There is a lot of "but it runs, ship it" Python hackers out there.
*Many I wouldn't class as real developers
Surely that's just an example of the writer not understanding what they are doing?
I agree there is a certain culture around different languages, but it doesn't have to be like that. I'm sure I could write really terrible stuff in any language.
Culture in coding is often centred on languages. Changing those cultures is hard. I've had Javascript devs complain to senior managers. Because I was making them write unit tests.
I blame "Agile development" more than any language. Too many took it to mean "no testing needed" and too many managers took it to mean "shorter development time = lower costs". Goodbye testing, let the users do it.
The initial concept of small, easily implementable code changes that required simple, minimal tests was fair enough, but too many saw a bandwagon with a brilliant excuse to do less work.
Interesting that one of the creators of Agile, Dave Thomas, posted a YT video declaring "Agile is dead" and saying that it did not turn out the way he expected.
I remember at a previous job we used agile for all our development. Including FPGA development (free nerd hat if you know what that stands for). Long and short of it is that FPGAs are a reprogrammable circuit that you can do digital logic on. They're brilliantly fast and flexible but take a bloody long time to code on. Someone thought that we could agile this, which is like saying you can steer an oil tanker with an outboard. I spent every standup saying "I'm doing the same thing as yesterday, writing testbenches and hardware description code. This hasn't changed".
Field Programmable Gate Array, from recollection they are used in GPUs (?)
The thing with this testing data is that, by big data standards, it isn't really big data. It is a trivial size for modern IT setups. Think how much new data Amazon must process every day from orders alone - as must many enterprises.
JS is literally designed so that you can write crap and it will still attempt to do something with it.
Like silently evaluating 2 + "3" to get "23" and then silently converting that to 23!
Have you seen that presentation, as well?
What presentation? I just made it up on the spot, having been caught out by JS's silent type conversions just recently (while writing a CSGO skin withdraw bot for my lad).
It's the equals that catch me out, triple vs double, quick! What's the difference
So is there an actual official figure for number of cases over the past few days? Or are we just guesstimating?
The figures on the dashboard are up to date, the extra cases have all been added in at this point.
Except that pinned at the top is this rejoinder: "The cases by publish date for 3 and 4 October include 15,841 additional cases with specimen dates between 25 September and 2 October — they are therefore artificially high for England and the UK."
That's consistent with what I've said.
Yes - the missed cases have been added as back dated data.
So for the days that they were added on, the reporting-day numbers are much higher than they should/would be. As in "today we added x cases to the data".
They are visible in the by specimen date data as well - assigned to the correct days.
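The distinction being described - cases counted against the day they were published versus the day the swab was taken - falls straight out of how you group the records. A hedged sketch with invented sample data:

```python
from collections import Counter

# Hypothetical case records: each has the specimen (swab) date and
# the date it was actually published in the figures.
cases = [
    {"specimen": "2020-09-28", "published": "2020-09-29"},
    {"specimen": "2020-09-28", "published": "2020-10-04"},  # late addition
    {"specimen": "2020-10-01", "published": "2020-10-04"},  # late addition
]

by_published = Counter(c["published"] for c in cases)
by_specimen = Counter(c["specimen"] for c in cases)

# The backlog inflates the publication-date series on the catch-up day,
# while the specimen-date series assigns those cases to the correct days.
print(by_published["2020-10-04"])  # 2
print(by_specimen["2020-09-28"])   # 2
```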
@MaxPB I'm a bit bemused by this XLSX vs CSV issue. An XLSX is a binary file (?), completely different from a CSV, so I get that Python might not understand it and will try to interpret the file regardless, but surely it must have some kind of verbose error logging?
Ideally you would throw up an error on clicking upload saying it's not a valid file format, but I've seen systems where it doesn't do anything and gives the end user no feedback on success or failure. As I said, this seems much more plausible than a file size limitation. Excel can store a million rows, and literally no one uses columns for anything other than headers; it's just about the stupidest idea I've heard.
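For reference, modern .xlsx worksheets top out at 1,048,576 rows by 16,384 columns, which is why one record per column hits the wall roughly 64 times sooner than one record per row. A sketch of the sanity check an export script could make (the function name is made up; the limits are Excel's documented maxima):

```python
# Documented .xlsx worksheet limits.
XLSX_MAX_ROWS = 1_048_576
XLSX_MAX_COLS = 16_384

def fits_in_worksheet(n_records: int, one_record_per_row: bool = True) -> bool:
    """Check whether n_records data records (plus a header) fit in one sheet."""
    if one_record_per_row:
        return n_records + 1 <= XLSX_MAX_ROWS
    return n_records <= XLSX_MAX_COLS

# ~65k test results fit comfortably as rows, but overflow as columns.
print(fits_in_worksheet(65_000))                            # True
print(fits_in_worksheet(65_000, one_record_per_row=False))  # False
```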
If your script has been written in a rush and uses csv.reader to parse standardised CSV files it will work pretty reliably, especially in a closed system where everyone has been trained to use the system properly. It's unsurprising that this started to become an issue when third party access was granted to universities, the training probably wasn't very good and the instructions were probably ignored. I've only seen it happen about a million times.
I take the point on the file size limitation, but surely any sensible dev/eng would have implemented a failsafe on the backend, which they would report to a log. You'd pick it up quite quickly I would have thought; you would be looping through and instantly throw an exception because the data would be nonsense compared to what the interpreter was expecting.
I would have thought the library would do this for you, in fact
Yeah, that's probably how the error was spotted, by some junior sifting through the script success logs when a manual audit was being done.
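The failsafe being described needn't be elaborate: check the file looks like CSV before handing it to the parser, and log a loud, specific error if it doesn't. A hedged sketch (the `ingest` function and the expected column names are illustrative; the `PK` check works because an .xlsx file is really a zip archive):

```python
import csv
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("upload")

EXPECTED_FIELDS = {"specimen_date", "result"}  # hypothetical schema

def ingest(raw: bytes) -> list:
    """Parse an uploaded CSV, rejecting anything that isn't one."""
    # Zip archives (including .xlsx) start with the bytes b"PK".
    if raw[:2] == b"PK":
        log.error("Upload rejected: looks like a zip/xlsx, not CSV")
        raise ValueError("not a CSV file")
    reader = csv.DictReader(io.StringIO(raw.decode("utf-8")))
    if not EXPECTED_FIELDS.issubset(reader.fieldnames or []):
        log.error("Upload rejected: missing columns %s", EXPECTED_FIELDS)
        raise ValueError("unexpected columns")
    return list(reader)

rows = ingest(b"specimen_date,result\n2020-10-01,positive\n")
print(len(rows))  # 1
```

Either failure mode then shows up in the logs immediately, rather than being discovered in a manual audit later.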
Old fogey comment alert -
I had a rule back when I was working never to trust spreadsheets unless I could reperform much of it and get the same result. It saved me from grief sometimes but on the flip side it led to me doing lots of manual work that bore no fruit other than peace of mind. I would be all at sea these days.
And still 'Green' types will be unhappy, because it won't be socialism.
Whatever you do, don't get a smart meter.
Energy companies want to use them to black you out when the above proposals don't produce nearly enough for our needs.
Heh. Not sure about that but I'd 100% agree with you about not getting one, they don't work properly with solar panels iirc. I've refused to be switched over at any rate.
And still 'Green' types will be unhappy, because it won't be socialism.
Please don't speak for other people out of ignorance.
My instant reaction is that there's some good, but it concentrates too much on the "high technology" side of things. Great if it works, but there's no guarantee.
Also, politicians have made promises on home insulation and carbon capture and storage for fifteen years, and there's not much to show for it.
There are missing things. I'm surprised that cycling isn't there. Johnson has a good story to tell on cycling. And sadly no mention of tidal power. I'd much rather have tidal power funding than modular reactors.
The specificity of hydrogen (presumably in part as energy storage) is worrying. It might not be the best way to store energy from wind. Government needs to find a way to encourage the market to find the option that works best, rather than trying to pick winners. Or it could just be a small amount of grant funding for development, which is fine, but feels a bit 1990s in terms of having something to absorb excess wind energy in the next 5 years.
CCS is a waste of time and money on its own, such as the US likes to do so it can mildly greenwash itself. As part of a reasonably integrated system it has some value as a retrofit to gas plants and the like which might need to run for another decade. It's going to depend how much CCS is in this plan.
It will divert a large amount of capital that could be better directed - and if this Johnsonian nonsense has any substance to it, will take on a momentum of its own, irrespective of its actual merits: “We want to lead on carbon capture and storage, a technology I barely believed was possible, but I am now a complete evangelist for.”
You'd be better building a few tidal barrages. At least they'll still be around in thirty years' time.
JS is literally designed so that you can write crap and it will still attempt to do something with it.
Like silently evaluating 2 + "3" to get "23" and then silently converting that to 23!
Have you seen that presentation, as well?
What presentation? I just made it up on the spot, having been caught out by JS's silent type conversions just recently (while writing a CSGO skin withdraw bot for my lad).
It's the equals that catch me out, triple vs double, quick! What's the difference
Double is loose equality with coercion and triple is strict equality? Anyway, I learned enough to always use triples.
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
I also don't know how they would write a script to parse such a file.
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
Yeah any time I've ever needed to parse something like that it uses headers and then loops the process down rows (think my record is 7.7m rows in a CSV). Columns for case data is just so odd as a concept.
We had 2 billion+ records in a db table. I think if you tried to export that to a CSV, Oracle would send somebody round to shout at you
Lol! We had someone try and download a 101m record table with around 30 fields into a CSV from GCP. He got shouted at by the sysadmin.
I wonder if these are the kinds of people that put milk into the bowl before the cereal
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
I also don't know how they would write a script to parse such a file.
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
Yeah any time I've ever needed to parse something like that it uses headers and then loops the process down rows (think my record is 7.7m rows in a CSV). Columns for case data is just so odd as a concept.
Yeah in twenty odd years of working with data (as an applied Economist not as a 'proper' data scientist) I have never seen cases added as columns. It's very weird, especially once you know how Excel is structured and assuming you know you will be reading the data into Excel.
GOP 2.9 now. 34.5% implied probability. I`d still favour backing Dems at these prices given 538 and The Economist predictions. The latter has Reps at 11% chance.
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
I also don't know how they would write a script to parse such a file.
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
Yeah any time I've ever needed to parse something like that it uses headers and then loops the process down rows (think my record is 7.7m rows in a CSV). Columns for case data is just so odd as a concept.
Yeah in twenty odd years of working with data (as an applied Economist not as a 'proper' data scientist) I have never seen cases added as columns. It's very weird, especially once you know how Excel is structured and assuming you know you will be reading the data into Excel.
I've seen it a few times, typically in spreadsheets from small businesses, created ad hoc by one person who no longer maintains them but which are still in use. Not from major ones.
It's something I'd expect from a one-man amateur, not a major organisation.
The PHE system may be, er, sub-optimal, but I’m amused by the idea that all these journalists tweeting about “and they used an Excel spreadsheet!!!” actually have the foggiest idea what they are talking about. I suspect that most of them struggle to use a word processor competently.
People in the public sector don’t have large professional dedicated IT teams supporting them. That’s not good, but it’s just reality. So they will build their own reporting systems based on what they use on a daily basis. Which in most cases will be Excel.
So is there an actual official figure for number of cases over the past few days? Or are we just guesstimating?
The figures on the dashboard are up to date, the extra cases have all been added in at this point.
Except that pinned at the top is this rejoinder: "The cases by publish date for 3 and 4 October include 15,841 additional cases with specimen dates between 25 September and 2 October — they are therefore artificially high for England and the UK."
That's consistent with what I've said.
Yes - the missed cases have been added as back dated data.
So for the days that they were added on, the reporting-day numbers are much higher than they should/would be. As in "today we added x cases to the data".
They are visible in the by specimen date data as well - assigned to the correct days.
This is by specimen date yesterday -
This is by specimen date 2 days ago -
Not quite increasing at the rate of the infamous "not a prediction", but rather concerning nonetheless.
And still 'Green' types will be unhappy, because it won't be socialism.
Whatever you do, don't get a smart meter.
Energy companies want to use them to black you out when the above proposals don't produce nearly enough for our needs.
Is there any conspiracy theory you won't go for?
Much as it pains me to say so, I actually agree. The fine print in the smart meter stuff is there so that the energy companies can reduce or shut off your supply if they need to shape demand. One of those "We'll never need to do this, but strangely we insist upon it" things.
More likely though is that smart meters get hacked by some group who then have fun turning off people's power.
Having worked on a government IT project (I wrote the road routing engine for the DfT transport portal), I can well believe this lunacy. One complaint we got from the civil service was that the road routing was producing routes across fields. It turned out that my back end was using Ordnance Survey data, as specified by the DfT, but the front-end website was plotting the routes using different, less complete map data, also as specified by the DfT.
CCS is a waste of time and money on its own, such as the US likes to do so it can mildly greenwash itself. As part of a reasonably integrated system it has some value as a retrofit to gas plants and the like which might need to run for another decade. It's going to depend how much CCS is in this plan.
It will divert a large amount of capital that could be better directed - and if this Johnsonian nonsense has any substance to it, will take on a momentum of its own, irrespective of its actual merits: “We want to lead on carbon capture and storage, a technology I barely believed was possible, but I am now a complete evangelist for.”
You'd be better building a few tidal barrages. At least they'll still be around in thirty years' time.
CCS seems to be included in these things as a sop to various lobbies.
In the UK it is of the form of "Show us some vaguely working tech and we might give you a grant".
The lobby is from the North Sea oil producers.....
So the spreadsheet they were using for the results reached its maximum size and simply excluded all the results that followed.
Epic fail.
WTAF? Who works with large datasets and doesn't know that Excel worksheets have a maximum size? I despair.
Who works with large datasets in Excel?!?
HMG
It's not a particularly large dataset. Not by modern standards.
Weaning organisations off Excel is hard. The whole planet is infested with the stuff.
There's nothing wrong with Excel, providing you actually need a spreadsheet. The trouble is far too many organisations use Excel as a sort of catch-all tool: for scripting, as a database, as a contacts directory; I've even heard of it being used for writing documents rather than using Word.
I got it to work out the odds of any poker hand winning against various numbers of opponents once, a mammoth task that took ages (and is nowadays easily performed by various cheap downloadable software). I was really proud of my achievement.
The PHE system may be, er, sub-optimal, but I’m amused by the idea that all these journalists tweeting about “and they used an Excel spreadsheet!!!” actually have the foggiest idea what they are talking about. I suspect that most of them struggle to use a word processor competently.
People in the public sector don’t have large professional dedicated IT teams supporting them. That’s not good, but it’s just reality. So they will build their own reporting systems based on what they use on a daily basis. Which in most cases will be Excel.
The same journalists who after 6 months of this crisis still regularly get the data wrong.....
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
I also don't know how they would write a script to parse such a file.
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
Yeah any time I've ever needed to parse something like that it uses headers and then loops the process down rows (think my record is 7.7m rows in a CSV). Columns for case data is just so odd as a concept.
Yeah in twenty odd years of working with data (as an applied Economist not as a 'proper' data scientist) I have never seen cases added as columns. It's very weird, especially once you know how Excel is structured and assuming you know you will be reading the data into Excel.
I've seen it a few times, typically in spreadsheets from small businesses, created ad hoc by one person who no longer maintains them but which are still in use. Not from major ones.
It's something I'd expect from a one-man amateur, not a major organisation.
Advice to any of them doing that - use the transpose function.
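And in script form the same fix is a one-liner: transposing column-oriented rows back into record-per-row shape is just `zip(*rows)` in Python. A sketch with made-up data:

```python
# Column-oriented layout: each case is a column, fields run down rows.
column_oriented = [
    ["case_id", 101, 102, 103],
    ["result", "pos", "neg", "pos"],
]

# zip(*rows) transposes it into the usual one-record-per-row shape.
row_oriented = [list(record) for record in zip(*column_oriented)]

print(row_oriented[0])  # ['case_id', 'result'] - the header row
print(row_oriented[1])  # [101, 'pos'] - the first record
```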
PHE using Excel for data tells you all you need to know about their expertise.
More common than you think - the number of systems I've seen where you can upload data by loading spreadsheets.....
That sounds so awful.
The shit I've seen. When this is over, at the next PB beers I'll do a standup routine on "IT systems - the bad, the insane and the stuff with the wrong number of dimensions"
I know way too much about the Java Apache POI library. Awesome though it is - for *generating* spreadsheets.
Python is a great scripting language. But for serious computing...
What do you mean by serious? Launching a nuclear strike or some data wrangling?
For the latter I don't think it really matters that much what language you use. What matters is how well it is written and managed, and how well the writer understands what they are doing.
It does matter. One of the problems with Python is the culture of the "developers"*. The number of times I have had to explain concepts of code structure and testing to Python and Javascript... code writers..... There is a lot of "but it runs, ship it" Python hackers out there.
*Many I wouldn't class as real developers
Surely that's just an example of the writer not understanding what they are doing?
I agree there is a certain culture around different languages, but it doesn't have to be like that. I'm sure I could write really terrible stuff in any language.
Culture in coding is often centred on languages. Changing those cultures is hard. I've had Javascript devs complain to senior managers. Because I was making them write unit tests.
I blame "Agile development" more than any language. Too many took it to mean "no testing needed" and too many managers took it to mean "shorter development time = lower costs". Goodbye testing, let the users do it.
The initial concept of small, easily implementable code changes that required simple, minimal tests was fair enough, but too many saw a bandwagon with a brilliant excuse to do less work.
Interesting that one of the creators of Agile, Dave Thomas, posted a YT video declaring "Agile is dead" and saying that it did not turn out the way he expected.
I remember at a previous job we used agile for all our development. Including FPGA development (free nerd hat if you know what that stands for). Long and short of it is that FPGAs are a reprogrammable circuit that you can do digital logic on. They're brilliantly fast and flexible but take a bloody long time to code on. Someone thought that we could agile this, which is like saying you can steer an oil tanker with an outboard. I spent every standup saying "I'm doing the same thing as yesterday, writing testbenches and hardware description code. This hasn't changed".
Field Programmable Gate Array, from recollection they are used in GPUs (?)
Not bad. Right definition, not quite the right application. GPUs have a huge set of parallel hardware multipliers that they use to rapidly crunch data. Multipliers are big digital structures so they're better implemented as dedicated logic. They're often used instead of FPGAs where power isn't an issue (such as in a desktop PC). FPGAs are more flexible because they're basically huge banks of memory. You can encode the truth tables of digital expressions in them in almost any combination, but they require specialist programmers who are almost always electronics engineers to make them work. City firms like them because you can co-process certain complex transactions on them for your high-speed-trading algorithm. They're also used extensively in consumer electronics and high end manufacturing. Your car will have a few, and a smart TV will have at least one.
So is there an actual official figure for number of cases over the past few days? Or are we just guesstimating?
The figures on the dashboard are up to date, the extra cases have all been added in at this point.
Except that pinned at the top is this rejoinder: "The cases by publish date for 3 and 4 October include 15,841 additional cases with specimen dates between 25 September and 2 October — they are therefore artificially high for England and the UK."
Therefore can I conclude that no one has an accurate figure for cases over the weekend?
That's the "by reporting date" graph.
The "by specimen date" graph, the only one you should be using, is now accurate.
So is there an actual official figure for number of cases over the past few days? Or are we just guesstimating?
The figures on the dashboard are up to date, the extra cases have all been added in at this point.
Except that pinned at the top is this rejoinder: "The cases by publish date for 3 and 4 October include 15,841 additional cases with specimen dates between 25 September and 2 October — they are therefore artificially high for England and the UK."
That's consistent with what I've said.
Apologies I think I was talking at cross purposes.
It boggles the mind even more when I think about it, that they were using columns.
It's literally easier to not use columns, why on Earth were they doing that
Any semi-competent developer would see the issue with the 16K limit. If they were receiving files in that format from elsewhere you'd ask them to change it because you'd know the limit would cause problems.
It is so incompetent, that it seems almost a conspiracy to me.
Well true but I question why anyone would choose to export the data in a format whereby one column represented one record
I also don't know how they would write a script to parse such a file.
Well it can be done - but it's clearly easier to not do it. Hence why libraries I've used expect a header row (I would imagine python is similar) and then they loop through the rest of the data row by row
Yeah any time I've ever needed to parse something like that it uses headers and then loops the process down rows (think my record is 7.7m rows in a CSV). Columns for case data is just so odd as a concept.
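The pattern being described — first line is a header, then stream the remaining lines row by row — looks something like this in Python's standard library. The field names and values here are invented for illustration:

```python
import csv
import io

# A minimal sketch of header-then-rows parsing with csv.DictReader.
# In practice you'd open a file; StringIO stands in for one here.
sample = io.StringIO(
    "case_id,specimen_date,area\n"
    "1,2020-09-25,Manchester\n"
    "2,2020-09-26,Leeds\n"
)

# DictReader consumes the header row automatically, then yields one
# dict per data row, keyed by the header names.
rows = list(csv.DictReader(sample))
print(len(rows))        # 2
print(rows[0]["area"])  # Manchester
```

A column-per-record layout defeats this entirely, because nothing in the file can be treated as a header until you've read the whole thing.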
Yeah in twenty odd years of working with data (as an applied Economist not as a 'proper' data scientist) I have never seen cases added as columns. It's very weird, especially once you know how Excel is structured and assuming you know you will be reading the data into Excel.
I've seen it a few times, typically when picking up ad-hoc spreadsheets from small businesses: created by one person who no longer maintains them, but still in ongoing use. Not from major organisations.
It's something I'd expect from a one-man amateur outfit, not a major organisation.
Advice to any of them doing that - use the transpose function.
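If the data really does arrive with one record per column, a transpose turns it back into the row-per-record shape everything expects. In Python, `zip(*grid)` plays the role of Excel's TRANSPOSE function; the values below are made up:

```python
# Column-oriented layout: each inner list is one field, spread across records.
column_oriented = [
    ["case_id", 1, 2, 3],
    ["specimen_date", "2020-09-25", "2020-09-26", "2020-09-27"],
]

# zip(*...) flips the axes: the first tuple becomes the header row,
# each subsequent tuple is one complete record.
row_oriented = list(zip(*column_oriented))
print(row_oriented[0])  # ('case_id', 'specimen_date')
print(row_oriented[1])  # (1, '2020-09-25')
```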
I know there's lots of focus on the fucktacular way the data has been procured/transmitted/stored, but surely the bigger picture is the actual growth in cases?
I think we'll need to wait for 4-6 days to actually see how this shakes out.
The PHE system may be, er, sub-optimal, but I’m amused by the idea that all these journalists tweeting about “and they used an Excel spreadsheet!!!” actually have the foggiest idea what they are talking about. I suspect that most of them struggle enough to handle or use a word processor competently.
People in the public sector don’t have large professional dedicated IT teams supporting them. That’s not good, but it’s just reality. So they will build their own reporting systems based on what they use on a daily basis. Which in most cases will be Excel.
We are discussing the issue, not journalists. Many (all?) of us have experience of software eng/dev, so I would like to think our collective brains have an idea of what has gone on here.
My faith in the Government with people like Dido on board leads me to the conclusion that yes, they did something stupid like use columns rather than rows. I am sure others take different views.
But we can discuss on the basis of what we do know and then speculate outside of that, as to the causes.
Yes I see my error now. Basic errors seem to be spreading exponentially...
EDIT Winning party also back up.
1.52 Democrats
The dog's breakfast that is Javascript, on the other hand...
It was fine until that employee left
Just my hypothesis for a believable course of events here.
If they can't, we are f8cked.
The sums don't add up. There just won't be enough power to go around...
https://books.google.co.uk/books?id=qYlZDwAAQBAJ&pg=PT42
The initial concept of small, easily implementable code changes that required simple, minimal tests was fair enough, but too many saw a bandwagon with a brilliant excuse to do less work.
Interesting that one of the creators of Agile, Dave Thomas, posted a YT video declaring "Agile is dead" and saying that it did not turn out the way he expected.
It's hardly surprising then, when he behaves as if COVID ain't no thing.
Last price matched on Republican win 3.0
Things have improved somewhat since, but JS is not my favourite language.
So for the days that they were added on, the reporting-day numbers are much higher than they should/would be. As in "today we added x cases to the data".
They are visible in the by specimen date data as well - assigned to the correct days.
This is by specimen date yesterday -
This is by specimen date 2 days ago -
I had a rule back when I was working never to trust spreadsheets unless I could reperform much of it and get the same result. It saved me from grief sometimes but on the flip side it led to me doing lots of manual work that bore no fruit other than peace of mind. I would be all at sea these days.
My instant reaction is that there's some good, but it concentrates too much on the "high technology" side of things. Great if it works, but there's no guarantee.
Also, politicians have made promises on home insulation and carbon capture and storage for fifteen years, and there's not much to show for it.
There are missing things. I'm surprised that cycling isn't there. Johnson has a good story to tell on cycling. And sadly no mention of tidal power. I'd much rather have tidal power funding than modular reactors.
The specificity of hydrogen (presumably in part as energy storage) is worrying. It might not be the best way to store energy from wind. Government needs to find a way to encourage the market to find the option that works best, rather than trying to pick winners. Or it could just be a small amount of grant funding for development, which is fine, but feels a bit 1990s in terms of having something to absorb excess wind energy in the next 5 years.
“We want to lead on carbon capture and storage, a technology I barely believed was possible, but I am now a complete evangelist for.”
You'd be better building a few tidal barrages.
At least they'll still be around in thirty years' time.
Disgusting. Shocking. Do these irresponsible takeaways not understand how many people they have killed by staying open those 4 minutes?
Having worked on a government IT project (I wrote the road routing engine for the DfT transport portal), I can well believe this lunacy. One complaint we got from the civil service was that the road routing was producing routes across fields. It turned out that my back end was using Ordnance Survey data, as specified by the DfT, but the front-end website was plotting the routes on different, less complete map data — also as specified by the DfT.
In the UK it is of the form of "Show us some vaguely working tech and we might give you a grant".
The lobby is from the North Sea oil producers.....
I had a tiny pre-season bet at 151 or so. Probably won't fiddle with that.
https://twitter.com/UKDefJournal/status/1313057850776784896?s=20
That's probably the entire Royal Navy to be honest.
The "by specimen date" graph, the only one you should be using is now accurate.
Cases doubled in Manchester.