As the year draws to a close, I decided to extend the overmortality statistics from a year ago. As you might remember, the overmortality was at about +13% for the entire EU. However, this year I cannot simply extend the overmortality plot, because Euromomo retroactively modified their data.
On 1 January 2023 I downloaded the pooled mortality statistics. Below are some lines from that download; the last column contains the mortality figures.
The plot above illustrates the modifications made to the data. Basically: a death now only counts for 42% of a death, and each week 50 deaths were removed from the data. Or, the other way around: during the pandemic every death counted roughly twice and extra deaths were added on top. I have no idea which it is, but this type of change requires an explanation.
How does this affect the overmortality?
After performing a regression from the new data to the old data, it is possible to extend the data we had. This leads to the following overmortality plot.
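A minimal sketch of that recalibration step, under my own assumptions (the actual coefficients and numbers below are made up for illustration): fit old ≈ a·new + b on weeks where both downloads overlap, then use the fit to express the revised series on the old scale.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for ys ≈ a*xs + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical overlapping weeks: suppose the revision really was
# new = 0.42 * old - 50, as estimated above.
old = [50_000.0, 55_000.0, 60_000.0, 52_000.0]
new = [0.42 * o - 50 for o in old]

a, b = fit_linear(new, old)
extended_old = [a * x + b for x in new]  # revised weeks, mapped to the old scale
```

With an exactly linear revision the fit recovers a ≈ 1/0.42 and reproduces the old series; on real data the residuals would show how consistent the modification actually is.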
Compare this to the old overmortality, which had the following plot
As can be seen, the overmortality in 2022 is suddenly lower than it was a year ago. Given that nothing but the data changed, we must conclude that the new Euromomo data has been altered to better fit their model of reality.
New overmortality statistics
Given that we can no longer trust the data from Euromomo, it is pointless to create a new overmortality plot. Nevertheless, here it is, for a window of 13 weeks.
This leaves us with an overmortality of about 6%, yet, as said, if we cannot rely on the veracity of the data, this is just a ballpark figure. Not only that: we might need to assume it is a lot higher, as there would otherwise be no reason to modify the data.
Below is my simulation of the four cases. The yellow line is the lens; the blue pattern is the simulated interference pattern.
Options A &amp; B are the concave lenses; options C &amp; D the convex lenses. In figure A we see the fast phase trend in front of the bulge, with a slow phase trend behind it. These are not visible in the three other options. However, the difference between A &amp; C (concave vs convex) is not necessarily easy to spot in this simulation. To figure that out, the 2D interferogram very likely has to be inspected to see which lines are 'pulled straight'.
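The concave/convex asymmetry can be sketched with a toy thin-lens model. Everything here (profile, refractive index, wavelength, curvature) is my own assumption, not taken from the video or my actual simulation: a convex lens is thickest at the centre, a concave one thinnest, so the accumulated phase delay bulges in opposite directions.

```python
import math

def phase_delay(x, convex=True, n=1.5, wavelength=0.5e-3, t0=2.0, curvature=0.1):
    """Optical phase delay (radians) for a plane wave crossing a thin lens
    at transverse position x (all lengths in mm; hypothetical constants)."""
    if convex:
        thickness = t0 - curvature * x * x   # thickest at the centre
    else:
        thickness = t0 + curvature * x * x   # thinnest at the centre
    # extra optical path relative to vacuum: (n - 1) * thickness
    return 2 * math.pi * (n - 1) * thickness / wavelength
```

Plotting this delay across x gives the inverted phase trends that distinguish A from C; only the interferogram reveals which fringes get 'pulled straight'.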
All in all: great video which inspired me to write some code.
I updated the overmortality statistics I made last year. Horizontally: weeks; vertically: mortality/expected − 1 (%). The window width is about half a year, making the curve stable and less sensitive to time shifts.
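The curve above can be sketched as a centred moving average of the weekly ratio; this is a minimal reconstruction under my own assumptions, not the script I actually used.

```python
def overmortality(mortality, expected, window=26):
    """Windowed overmortality in percent: mean of mortality/expected - 1
    over a centred window of `window` weeks (26 ≈ half a year)."""
    ratios = [m / e - 1 for m, e in zip(mortality, expected)]
    half = window // 2
    out = []
    for i in range(len(ratios)):
        lo, hi = max(0, i - half), min(len(ratios), i + half + 1)
        out.append(100 * sum(ratios[lo:hi]) / (hi - lo))
    return out

# e.g. a constant +13% mortality surplus yields a flat 13% curve
curve = overmortality([113.0] * 60, [100.0] * 60)
```

Averaging the ratio over a wide window is what makes the curve robust against small time shifts between the mortality and baseline series.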
A breakdown
March 2020 – corona
Beginning of 2021 – euthanasia instead of assuming you can help corona patients
I have been delving into some data analysis again. There are two pictures I would like to share to clearly demonstrate the futility of the current corona vaccines. The plots are based on excess mortality in the EU and the vaccination rate as provided by the EU.
The first plot shows the overmortality compared to the 5 years before corona (red curve). The green curve shows (on an unrelated axis) how many people got their first vaccine. What is surprising is that the vaccination campaign correlated with a spike in overmortality. This is to be expected, because the elderly were the first to receive them. From that perspective, the vaccine functions as a weed-whacker: if the vaccine doesn't kill grandpa, then corona probably won't either.
The second point often heard is: the vaccines are effective. Looking at this plot, that is difficult to confirm. The first peak had a total overmortality of ~170'000. The second peak an overmortality of ~275'000. And the third peak, which we are only halfway through, already has an overmortality of ~161'000. When that peak is over, we are very likely looking at ~300'000. Thus: there is no measurable positive impact of vaccination. You might argue: 'yes, but without the vaccine it would be worse'. On the contrary: such a 'pandemic' tends to become endemic after a series of peaks, even without vaccination. So the lower peak amplitude we see now would very likely have happened without the vaccine as well.
In the first plot I only looked at excess mortality against the baseline, without adding any standard deviations. If we add 4 sigmas, we end up with pretty much the same results: peak 1: ~47'000 deaths; peak 2: ~91'000 deaths; peak 3: currently 65'000 deaths, expected ~117'000. However, in this plot it becomes even more apparent that vaccination had no measurable impact.
Any feedback that does not involve calling me names is welcome. In particular, I am interested in looking further into the difference between the vaccinated and non-vaccinated groups, but it is truly hard to find any reliable data on this.
BpmDj… my brainchild… bringing in no money… So I am looking for free tools.
ClickOnce
ClickOnce is a technology by Microsoft to easily start and upgrade an application. It is remarkably similar to the packaging solution I created in Java. At start time it checks what is available and downloads it if the user wants to. I like this.
ClickOnce of course requires 'code signing' certificates, which are really difficult to create. And without shelling out money, Windows SmartScreen will always complain when the application is installed or upgraded. Yet… I will not spend 100 EUR/year just to remove that dialog.
A solution would be to use no installer, and then silently upgrade the application behind the user's back. Nevertheless, even then I will get the 'untrusted application' message, so I will assume BpmDj users are smart (they are) and will realize that it is pointless to spend so much money on something they will accept anyway.
In order to figure out how much information assemblies leak, just have a look at https://www.jetbrains.com/decompiler/ It basically returns the original source code, including all variable names and everything else that could have been stripped out. Therefore, an obfuscator is really necessary. http://www.dotnetstuffs.com/best-free-obfuscator-to-protect-net-assemblies/ has a list of interesting candidates.
Dotfuscator (A Lead to Sell)
The Microsoft site refers to 'DotFuscator'; and let me tell you… the community edition is bullshit. The entire thing is one big lead to sell you their product. It starts with a forced registration (you have to give a valid email address). Then, once you are in the application, you mostly see advertisements, not a lot of actually useful obfuscation going on. And lastly, when I ran it on BpmDj it wasn't even able to get through it, because of 'mixed assemblies'. I am sure I could set up a joint project with PreEmptive Solutions, in which I would of course pay them, but honestly… don't bother with this bullshit. The community edition doesn't do what it pretends to do.
Obfuscar (No XAML)
Obfuscar tries to map as many input names to as few output names as possible. ‘Massive overloading’ as they call it. https://code.google.com/archive/p/obfuscar/
At first glance this seems a dead end… the last release was 11 years ago. However, Stack Overflow posts still discuss it in 2018. Ha, no: it seems to have moved to https://github.com/obfuscar/obfuscar
Amazingly enough, after putting together a simple configuration, it actually ran through the entire shebang of assemblies and generated one output. That output could even start! Yet it hung at the splash screen. Attaching a debugger showed that all threads had been started properly, so I assume either I access a dynamic resource by name (I do have some explicit invokes lying around), or the XAML bindings were seriously fucked up. This is something I should test further, because if this works we are done.
Oh, the horror. The default configuration doesn't actually obfuscate shit. All identifiers were still present, despite the fact that it claims it produced a 'mapping'. Probably it kept all public properties public, as they were, without renaming them.
I also figured out that the BAML/XAML tree is still stored in the assembly as it was present in the original solution, so no reordering takes place in any way. Not a total failure, but not great either.
ConfuserEx (Abandoned)
Finally, something that is not a landing page. Hurray! Last update… 26 January 2019… still, it might work and it is open source. The project has indeed been discontinued since 1 July 2016.
In any case, a run of it behaved similarly to Obfuscar: the application started but didn't get further, probably because of the missing DLLs. I might need to fix that problem if no obfuscator gets through it. On the plus side, the XAML was effectively gone after obfuscating; or the dotPeek decompiler stopped trying. I am not entirely sure which it is yet.
After spending some hours on this, the problem seems to be in the renaming strategy used. I am not yet sure whether to blame ConfuserEx or my own program, given that Obfuscar produced exactly the same error. Then again, maybe they are both based on the same source, so they might both suffer from the same bug.
In any case, applying a 'none' protection did not damage the original assembly, which is already a good sign. Also nice was that there was a debug protection in place, which caused the application to bark when a debugger tried to connect.
Skater Light (Does not obfuscate at all)
Skater is also a piece of software to buy. They do have a free version, named Skater Light. Oddly enough… this feels a lot like Chinese spyware. Seriously:
It worked at the first attempt. So I was a bit skeptical… I decompiled the generated assembly and, lo and behold, the thing was just not obfuscated whatsoever.
After installing it, it actually ran with elevated privileges (I know that because I could not read the generated assembly).
Dead Ends
Eazfuscator (not free) – Next up is Eazfuscator, because they seem very eager to actually deal with the WPF/XAML issue. Oh well… forget it. Not free anymore. This is the point where I considered whether it would be possible to use a decompile tool to decompile an obfuscator, remove the licensing restrictions, and continue. There is a certain beauty to this approach: if the obfuscator sucks, then we can easily do exactly that, which makes it pointless to actually use it.
CodeFort (disappeared) – was mentioned as another option which works well with XAML. Yet, the latest update on the Twitter feed was in 2010 and the domain itself became a landing page.
Agile.NET (non free)
FXProtect (disappeared)
ILProtector (not free) – has gone commercial since version 2.0.17
CodeVeil – encrypts the DLL before executing it. In the end this might be a better option than 'obfuscating' it. A drawback is of course that we have a single point of failure. Another drawback is that it is a Chinese product and only a trial version is available.
In BpmDj we load objects on demand: every time a particular object is accessed we load it from the database. This process happens automatically, and is implemented through a dictionary which maps an object id to a runtime representation.
In Java, this dictionary was a WeakDictionary: a dictionary from which values can be removed by the garbage collector. When they were removed and the program accessed that object again, we would load it fresh from the database. This poor man's caching is not particularly good, because any garbage collect will remove all loaded (but unreferenced) objects, forcing the program to reload those objects again, even if a particular object is often used.
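The same behaviour can be sketched with Python's WeakValueDictionary (the Song class and id are hypothetical stand-ins for BpmDj's runtime objects): once the last strong reference to a value is gone, a collection drops the entry, and the next access would have to hit the database again.

```python
import gc
import weakref

class Song:
    """Stand-in for a runtime object loaded from the database (hypothetical)."""
    def __init__(self, object_id):
        self.object_id = object_id

cache = weakref.WeakValueDictionary()

song = Song(42)
cache[42] = song
assert 42 in cache   # a strong reference is still alive

del song             # drop the last strong reference
gc.collect()         # after collection the cache entry disappears,
                     # so a real loader would re-fetch object 42
```

This is exactly the weakness described above: the cache forgets hot objects as eagerly as cold ones.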
To solve that, we could force references to stay in memory by means of a round-robin queue. Every time an object is accessed, it is put in the next position in the buffer. As such, we ensure that the cache keeps X instances alive.
Sadly that strategy is unable to deal with a burst of requests. Any often used object will simply be pushed out of the buffer when a batch of new objects is loaded (like for instance when the song selector opens).
To alleviate this problem, we can, with each access, gradually increase the stickiness of a cache item. This idea turned out to be fairly efficient:
every entry has a position in the buffer. Whenever the entry is hit, it moves to half its original position.
every new element is placed in the middle of the buffer.
This strategy leads to a distribution where often-used elements sit at the front of the buffer, while lesser-used elements slowly walk their way out of the buffer until they are evicted. To avoid items becoming too sticky (e.g. items that have been accessed just often enough to never leave the buffer again), it is useful to add a random element to this:
reposition an element to a random position between 0 and ratio * originalRank.
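The rules above can be sketched as follows; this is a minimal reconstruction, not the BpmDj implementation, and the eviction detail (dropping the last entry when the buffer overflows) is my own assumption.

```python
import random

class ElevatorCache:
    """Sticky 'elevator' cache: hits float an entry toward the front,
    new entries start in the middle, the back of the buffer is evicted."""

    def __init__(self, capacity, ratio=0.5, seed=0):
        self.capacity = capacity
        self.ratio = ratio            # 'ratio' from the reposition rule above
        self.rng = random.Random(seed)
        self.buf = []                 # index 0 = front = stickiest

    def access(self, key):
        if key in self.buf:
            i = self.buf.index(key)   # originalRank
            self.buf.pop(i)
            # reposition to a random slot between 0 and ratio * originalRank
            self.buf.insert(self.rng.randint(0, int(self.ratio * i)), key)
        else:
            # new elements enter in the middle of the buffer
            self.buf.insert(len(self.buf) // 2, key)
            if len(self.buf) > self.capacity:
                self.buf.pop()        # evict the least sticky entry

cache = ElevatorCache(10)
for k in range(100):                  # one hot key amid a stream of cold ones
    cache.access('hot')
    cache.access('cold%d' % k)
```

Because newcomers enter in the middle, a burst of fresh objects only pushes out the back half of the buffer; the hot entry keeps being pulled toward the front and survives the burst, which the round-robin queue could not guarantee.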
One could argue that having too many object ids and too few actual objects would be a cause for concern, and it clearly is. Nevertheless, there often is a space tradeoff between holding on to an object and using its id.
The image shows the buffer of a cache with capacity 100, with 800 distinct elements accessed at random. The access pattern was shaped according to a power-law distribution. Entries at the front of the cache are stickier than those in the later part of the buffer. The height of each entry indicates its priority in the emitter.
The following picture shows the difference between three types of cache: the first is the round robin mentioned earlier, the second is a cache which keeps back references, and the elevator cache is the one implemented here.
The data on which this was run was the retrieval of all startup objects BpmDj needs, including the opening of the song selector. The total object count was 133632, of which 70291 were unique.
After having tested both of them extensively, I can draw the following conclusion: MSTest is definitely the winner. Why?
XUnit
buggy as hell. For a testframework this is kinda weird
very very slow
really confused about which tests are available
No standard output. Yes I know you can redirect it, still they should not steal my debug output in the first place.
Different assertions than MSTest, and badly implemented at that (e.g. an assertion checking the content of a collection will simply iterate over all elements. It is truly painful to see how far computer scientists have sunk)
crashes VS2019 when in auto-hide
Talks about [Theories] and [Facts] instead of [TestMethod], just some ‘cool’ jargon and indeed far removed from reality.
MSTest
Does not have the same level of ‘we are so cool but can’t program’ fuckery as Xunit
Although this post is small, nobody else seems to care to say exactly how bad xUnit is.
At the moment the style is created, NormalTextColor is not yet defined, and so it stays undefined whenever that style is applied later. If we swap the NormalTextColor definition and the Style, then it will be a fixed yellow.
Dynamicresources are resolved whenever necessary
If we change the StaticResource into a DynamicResource, then that example will behave correctly, and every textblock will have a yellow foreground.
This means that any textblock within the stackpanel will be colored red, while the textblocks outside the stackpanel yet inside the window will be orange. And if the application.xaml is defined as in our first example, then any other window will be yellow.
It might be necessary to restyle multiple controls
Whenever a textblock is used it will have the provided style. A label however has its own foreground color defined, and so requires an extra style.
ControlTemplates provide a way to render a particular element differently. The dynamic lookup still goes from the lexical point of insertion up the logical tree. Thus the following fragment
will render both 'Zhe legend' and the actual content of the label using the same dynamicresource: that is, they will both have the same color, even if the controltemplate was defined in a different file. (One could expect that 'Zhe legend' would follow a lookup hierarchy starting from the definition of the template, while the ContentPresenter would follow a different hierarchy.)
The logical parent with controltemplates
The logical parent of the contentpresenter is the controltemplate, which is the same as the control being templated. Thus if we set the template of a label to something, and then define resources in the controltemplate (as ControlTemplate.Resources), then these resources are part of the label, and thus are visible to dynamicresources applied to the contentpresenter.
Yet, if we place the resources in a subelement within the controltemplate, then they are not part of the label, and thus not part of the logical chain of parents of the contentpresenter.
Under the assumption that the default color has been set to red in the App.xaml, we have two ways to define a controltemplate, with two different results
Who has priority ?
Because the ControlTemplate and the original instantiation both access the same resource dictionary frame, it is useful to figure out who has priority. The answer: the controltemplate's resources are applied first; afterwards, those defined in the actual instantiation of the control.