Hi all,
I wanted to update you on a discussion Francesco and his team, Stefano and I had just before Christmas. The topic was the first MISRA C report in “Restricted (Misra): CodecheckResultXEN-v1-2019-11-19.xlsx”, which
George, Stefano and I took as a fairly accurate description of the actual issues in our codebase.
However, we misinterpreted the purpose and accuracy of the report. The intention was primarily to provide some data to inform the tailoring of the standard, not to provide a baseline from which to
start fixing issues. Specifically:
- Getting a rough idea of the classes of violations with respect to the MISRA rules in a tiny SW config
- This was meant to be used as a reference for understanding the main critical rules violated by the system
- What is NOT an objective of a preliminary MISRA evaluation analysis is providing a baseline for fixing issues
Resiltech tends to use the Scitools Understand tool for this, because it allows you to play with different variables and configurations very easily. As this was somehow
missed, Stefano, George and I interpreted the results to be much more accurate than they were intended to be, primarily because
I see MISRA (or, more generally, coding standard) compliance as a hard community challenge and as one which can be started independently of the overall plan.
Scitools Understand: Desktop Tool
Basically, it works by dropping files and constraints into the tool (via the UI). It does not model or follow the build process. As such the tool is good for scoping and experimentation, but not for reliable
in-production compliance checking. See
https://scitools.com/features/
Note: Scitools does provide demo licenses
First analysis
(“Restricted (Misra): CodecheckResultXEN-v1-2019-11-19.xlsx”)
The analysis included ONLY C files without header files, pre-processor definitions, etc. This means that
- We are analysing too much code, because the pre-processor defines which would restrict the scope are not applied
- It also means that
only rules which do not depend on header files are in fact accurate
- Any MISRA rule which depends on language objects defined elsewhere (e.g. assembler functions, anything in header files, anything in generated code) would come up as an issue, and these were the kind of issues I picked up on (see the sketch below)
- In addition, in such a scenario one ends up with tool warnings; as these are fixed, a) the number of false positives goes down and b) the overall number of reported issues goes up. Typically, the number of issues increases faster than the number of false positives decreases
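To make this concrete, here is a minimal, made-up sketch (the file, function and macro names are hypothetical, not taken from our codebase) of why analysing a .c file without its headers and build flags skews the numbers; it deliberately does not compile on its own:

/* Hypothetical foo.c, analysed on its own: no foo.h, no -D flags from the build */

#include "foo.h"                      /* not visible to the tool                  */

#ifdef CONFIG_FEATURE_X               /* without -DCONFIG_FEATURE_X the tool      */
int feature_x_enabled = 1;            /* cannot tell whether this code is part of */
#endif                                /* the (tiny) configuration being assessed  */

int foo_init(void)
{
    /* set_feature_bit() is declared in foo.h (or implemented in assembler);
     * with the header out of scope the checker only sees an implicit
     * declaration and reports violations which are scoping artefacts rather
     * than real issues in the code itself. */
    set_feature_bit(FEATURE_DEFAULT); /* FEATURE_DEFAULT is equally unknown */
    return 0;
}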
Next analysis
Once this misunderstanding was cleared up, we discussed how we could make this better. After a while we settled on Stefano providing some extra info, which is available at
https://cryptpad.fr/drive/#/2/drive/edit/9GIn7VTsT6zP8gcWUxEfDEIb/
The results of that analysis are not yet available
@Stefano: I know you pulled this together manually. Is there a way to automate this via a script, or could you provide instructions?
Going forward (we should briefly discuss this next week)
I think we need to somewhat separate:
1. Any work related to MISRA C compliance for scoping and tailoring
- The main outcome of this would be the tailoring document
2. Work related to MISRA C compliance for the community and for fixing issues: there is obviously a dependency here in that we should
only fix what we have to
The kind of things I am looking for in 2, and this is fairly urgent, are:
- What and how to fix: An assessment of the kind of issues we see in the codebase and a sense of how controversial it would be to fix these
I think I can now work with the report we have and just focus on subsets of issues, i.e. those which only occur in C files (such as 15.6, 15.5, 15.1, 16.3, …); see the small sketch at the end of this mail
The ideal outcome would be a set of coding standard changes: this is going to be the most controversial part
- Fixing different classes of issues:
to fix specific classes of issues, we do not necessarily need proper tooling support, although having that is obviously desirable.
- Maintaining what we have fixed: One of the key problems
we will have is around tooling and checking of patches in the CI infrastructure. What we really need is
- A tool which can be integrated into our CI infrastructure:
i. This means it can’t be a tool such as Scitools Understand, or MISRA C checking offered as a service
ii. It has to be a tool which can be dropped into a build process and which produces parseable and human readable
output
iii. The tool would need to be integrated as if it were a new compilation toolchain
iv. We would need to run the tool for every patch series submitted, ideally highlighting *any* new MISRA
issues and any removed MISRA issues. In other words, in the tool integration we would highlight the diff of MISRA issues which results from the diff to the codebase
v. Ideally, we would be able to switch this on per class of issue *before* the codebase has been changed
to adhere to it
- A tool which has a low rate of false positives
i. When we were discussing this with Resiltech, it became apparent that this could be a major issue
ii. If the number of false positives is too high, people will start to ignore the process and whatever we did
in 2 will eventually become stale
The only way I can see that there will be progress in this area is if one of the member companies picks this task up and drives it.
Also, unless we know we can solve the maintenance and tooling problem above, attempting 1 and 2 is almost pointless (although there is still value in 1).
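As an illustration of the “what and how to fix” point above, here is a small, made-up sketch (the functions are hypothetical, not taken from our codebase) of the kind of mechanical change Rules 15.6 (loop/if bodies must be compound statements) and 16.3 (every switch-clause must end with an unconditional break) imply:

/* Rule 15.6: the body of an iteration/selection statement shall be a
 * compound statement. Before: while (count > 0) count--;               */
void drain(int count)
{
    while (count > 0)
    {
        count--;
    }
}

/* Rule 16.3: an unconditional break shall terminate every switch-clause.
 * Before: case 0 fell through into default.                            */
int classify(int code)
{
    int rc;

    switch (code)
    {
    case 0:
        rc = 1;
        break;
    default:
        rc = -1;
        break;
    }

    return rc;
}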
Best Regards
Lars