
Re: deviations tagging



On 6/15/22 09:19, Roger Pau Monné wrote:
On Tue, Jun 14, 2022 at 03:47:12PM -0700, Stefano Stabellini wrote:
Hi all,

Roberto was suggesting to use the following different categories for
tagging deviations. We could pick any "TAG" we like for the in-code
comments (or other tagging systems).

I am also CCing the MISRA C team to give them early visibility on this.
Feel free to provide early feedback if you have any. The plan is to
discuss it further during the next fusa-sig call and come up with a more
detailed proposal (including the actual tags, how to use them and more)
for xen-devel next.

Cheers,

Stefano



adopted

    The report should be considered as originating entirely from adopted
    code, with no contribution from native code.

safe

    The report is correct, but the specific behavior is safe with respect
    to every aspect the guideline is assumed to cover.

relied

    The report is correct, but the rule concerns exclusively "developer
    confusion" or readability matters that are not relevant for adopted code,
    which is assumed to work as is and is not meant to be read, reviewed
    or modified by human programmers.  To be used for adopted code only.

false-positive

    In the opinion of the developer, the violation report is not correct,
    and the problem has been reported to the tool provider.
    To be used only for violation reports.

Do we want to tag false positives?  There's no benefit at all to our
code base in tagging false positives; I think those should get fixed in
the checker tool, or otherwise be marked as false positives somewhere
else (i.e., in the tool itself).

A false positive is a "definite violation" report issued by a tool which
turns out not to be real.  (Note that a "possible violation" report,
sometimes called a "caution" or an "orange", cannot be a false positive
because it is not a positive: not all tools make this distinction.)

False positives will be reported by all tools.  It is not simply a matter
of the tool being sound or defective: the MISRA coding standards are human
artifacts with ambiguities and defects.  Issues of the form "is this a
violation?" can be on the agenda of the MISRA working groups for years
before a final
decision is made.  During this period, some tools will implement one
interpretation and other tools will implement a different interpretation.
(If this surprises you, there are known defects of the C standard that,
after 20+ years, still nobody is sure how to solve.)

So, a false positive might sit there for ages, and the benefit for your
code base in tagging it is that you need not reanalyze it over and
over again.

compliant

    The developer can prove that the possible non-compliance shown by a
    caution report cannot happen in any situation, and can motivate such a
    claim.  To be used only for caution reports.

false-negative

    The developer has found a non-compliance not shown by the tool and the
    problem has been notified to the tool provider.

I'm also not sure tagging false negatives is helpful either, especially
if we consider that this tag system is not bound to any specific
tool.  What is a false negative for one checker tool might not be for
another, and hence the tag would cause confusion.  False negatives
should be tagged like any other violation, ignoring the fact that a
specific checker tool hasn't been able to spot them.

Also, I think the usage of 'report' in the descriptions is confusing.
AFAICT this is supposed to mean that tags are added in reaction to
reports from checker tools, but what about deviations that are found by
humans?  There is no 'report' to refer to in that case.  The language
seems focused on tags being a reaction to a report from a checker tool.

I am not sure what you mean by "deviation that are found by humans".
I guess you mean "violations that are found by humans."
This is indeed rare, because humans are not very good at spotting
MISRA violations, but once they do, tagging the occurrence is helpful
during the period in which you and the tool vendor (and maybe the
MISRA C working group if there is no consensus on whether there
is a violation there or not) decide what to do about it.

Kind regards,

  Roberto




 

