[ overboard / sfw / alt / cytube] [ leftypol / b / WRK / hobby / tech / edu / ga / ent / 777 / posad / i / a / R9K / dead ] [ meta ]

/tech/ - Technology

"Technology reveals the active relation of man to nature"



File: 1625541609199.png ( 11.01 KB , 500x500 , 5658210d02e36434a2d244af15….png )

 No.9883

Why are there no neural networks for making sure there isn't any hidden obfuscated malicious code in OSS yet?

 No.9888

Neural networks are completely useless for anything other than party tricks.

 No.9890

>>9883
Because obfuscated code shouldn't be in the codebase to begin with, and any good code review process will screen it out. Code is supposed to be readable; the maintainers of OSS know that and won't allow obfuscated code in.

 No.9894

>>9883
Fuzzing gives you some degree of automation in code review, but that's pretty much it. It's useful for finding errors that can be exploited for buffer-overflow attacks and the like.
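A minimal sketch of the idea in Python. The parser and its planted bug are made up for illustration; real fuzzers like AFL are coverage-guided, but the core loop is just this:

```python
import random

def parse_length_prefixed(data: bytes) -> bytes:
    """Toy parser: first byte is a checksum divisor, rest is payload.
    It has a planted bug (division by zero) for the fuzzer to find."""
    n = data[0]
    payload = data[1:]
    checksum = sum(payload) % n  # bug: crashes when the first byte is 0
    if checksum > 200:
        raise ValueError("bad checksum")
    return payload

def fuzz(parser, trials=2000, seed=0):
    """Throw random byte strings at a parser; keep any input that triggers
    something other than the parser's own declared error."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(trials):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 16)))
        try:
            parser(data)
        except ValueError:
            pass  # an expected, handled rejection
        except Exception:
            crashes.append(data)  # an unexpected failure worth a bug report
    return crashes
```

Every input the fuzzer keeps here starts with a zero byte, i.e. it rediscovers the planted bug without ever reading the source.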

Neural-network machine learning is, at base, statistical brute force. You would have it look at strings of program code and assign a statistical probability of security vulnerabilities to code fragments. Then you stack the algorithms to assign probabilities to code sequences made up of those fragments, and you keep stacking, again and again. Eventually you are trying to map out the entire space of possible hardware states, and that takes too long.
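To make the base layer concrete, here's a toy version of "assign a statistical probability to code fragments": a bigram frequency model over code tokens. The corpora and fragments are invented for illustration; the point is that it only counts co-occurrence, it understands nothing:

```python
from collections import Counter

def bigrams(tokens):
    # Adjacent token pairs: the "fragments" the statistics are built on.
    return list(zip(tokens, tokens[1:]))

def train(fragments):
    """Count token bigrams over a labelled corpus of code fragments."""
    counts = Counter()
    for frag in fragments:
        counts.update(bigrams(frag.split()))
    return counts

def risk_score(fragment, vuln_counts, safe_counts):
    """Relative-frequency score with add-one smoothing: values above 1
    mean the fragment's bigrams show up more in the vulnerable corpus."""
    score = 1.0
    for bg in bigrams(fragment.split()):
        score *= (vuln_counts[bg] + 1) / (safe_counts[bg] + 1)
    return score
```

Stack enough layers of this and you get a neural network's view of code: probabilities all the way down, concepts nowhere.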

Neural networks are powerful tools because computers have achieved epic scales of circuit logic that allow brute-force number crunching, but in this application that's a disadvantage, because now you have to proofread something of epic scale. You need concept-forming intelligence to understand code.

When you learn a dexterity skill, like throwing a ball in a very precise way, you have to repeat the throw over and over so your brain can collect lots of data about the world and the body parts involved in throwing. The brain takes that data and spits out a skill, and in some ways that is similar to what happens in machine learning. But even once you know how to throw the ball, you gain no conceptual understanding of aerodynamics or ballistic trajectories.

The real way to use automation to make code more secure is to make compilers more intelligent. The basic premise is to build a more advanced stack of programming tools that won't let you write insecure code in the first place. This is more in line with how we figured out building regulations to keep houses from collapsing. What's needed is forensic analysis of the actual hacker exploits being used in the wild. From that we can build general code-construction regulations into the compilers that prevent exploitable code, the way building codes prevent unsound construction.
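A crude sketch of what one such "building code" rule might look like once forensic analysis has fed it back into the toolchain. The banned list is illustrative, and a real compiler pass would inspect the AST rather than scan tokens, but the shape is the same:

```python
import re

# Illustrative "banned calls" list: functions behind decades of
# buffer-overflow exploits, the kind of thing forensic analysis
# of real attacks would feed back into the toolchain.
BANNED = {"gets", "strcpy", "sprintf"}

def check_source(src: str):
    """Scan C source text, reporting (line, function) for each banned call."""
    violations = []
    for lineno, line in enumerate(src.splitlines(), start=1):
        for fn in sorted(BANNED):
            # \b keeps us from flagging safe lookalikes such as fgets().
            if re.search(rf"\b{fn}\s*\(", line):
                violations.append((lineno, fn))
    return violations
```

A build that refuses to proceed while `check_source` returns violations is the compiler equivalent of an inspector refusing to sign off on unsound construction.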

 No.9899

>>9883
Because I will eventually poison it to accept my shitty patches.

>>9894
I don't think software 'engineering' can be regulated as linearly as other engineering disciplines. The laws of physics and the heuristics humanity has gathered are static, whereas exploit writers are always adapting to a changing environment. They are always interested in what existing tools and defense mechanisms cannot do, not in what they prevent. Software engineering actually has a lot more in common with 'financial engineering' when it comes to regulatory challenges, and we all know how responsible they are.

Security through correctness has always been the way, but even after Microsoft's decade's worth of investment in formal verification tools, we still don't have a killer language or toolchain productive enough to be adopted at mass scale. Rust could fit the bill, but I'm skeptical of it for obvious reasons.

 No.9900

>>9899
>I don't think software 'engineering' can be regulated as linearly in other engineering disciplines. Laws of physics and heuristics humanities gathered are static whereas exploit writers are always adapting to changing environment.
Wait a minute, the compiler safeguards that reduce vulnerability to exploits don't have to work like linear regulation; you're taking my building-code metaphor way too literally.

>Software engineering actually have lot more in common with 'financial engineering' when it comes to regulatory challenges and we all know how responsible they are.

No, lawyers write code that is executed by judges, bureaucrats and functionaries, not processors. Brains are not pure logic interpreters. Legal code does not work like software.

>>9899
>Security through correctness has always been the way but even after Microsoft's decade worth investment in formal verification tools we do not have killer language or toolchain that is reasonably productive enough to be adopted in mass scale.
I know about MS's code quality project, but you can't expect a big, soul-crushing corporation to foster real innovation. There is no way anything new could pass through the membranes of corporate administration; too many people have made careers out of managing the half-broken mess.
>Rust could fit the bill but I am skeptical of it for obvious reasons.
Yes, that is the popular example, but there's no reason other compilers can't have functionality of this type added. Or you could even try a pre-compiler as a separate insert into the toolchain.

 No.9901

>>9899
>we do not have killer language or toolchain that is reasonably productive enough to be adopted in mass scale
What? Powerful languages that can be formally proven, like LISP, already exist; they're just an unnecessary headache for most applications, and proving programs simply costs way too much when you can just fix them when you see they're broken.
We will never, ever have a "reasonably productive enough" way to produce formally verified code, because it's way too hard, too slow, and too costly.

 No.9902

Don't certain sectors (automotive, locomotive, aerospace, medical, etc.) have their own "building code" style regulations? I did an internship at a company working on embedded software for railway signal control, and they had a very specific coding style and some functional requirements, like everything having to be redundant. But I'm not sure whether these were state-mandated or just something the railway company wanted.
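The redundancy requirement can be sketched like this: the same quantity computed through two independently written paths, plus a comparator that fails safe on disagreement. The braking-distance functions are invented for illustration; real signalling systems do this in hardware lockstep or with diverse software channels:

```python
def braking_distance_v1(speed_mps: float, decel_mps2: float) -> float:
    # Channel A: the textbook formula v^2 / (2a).
    return speed_mps * speed_mps / (2.0 * decel_mps2)

def braking_distance_v2(speed_mps: float, decel_mps2: float) -> float:
    # Channel B: same quantity via kinetic energy per unit mass divided
    # by deceleration, deliberately written a different way so one bug
    # is unlikely to hide in both channels.
    kinetic_per_kg = 0.5 * speed_mps ** 2
    return kinetic_per_kg / decel_mps2

def redundant_braking_distance(speed_mps, decel_mps2, tolerance=1e-9):
    """Run both channels and refuse to answer if they disagree."""
    a = braking_distance_v1(speed_mps, decel_mps2)
    b = braking_distance_v2(speed_mps, decel_mps2)
    if abs(a - b) > tolerance:
        raise RuntimeError("redundant channels disagree, failing safe")
    return a
```

The point of the style rules is the same: make each channel boring and auditable, so disagreement between them means a real fault, not a style quirk.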

 No.9905

>>9900
>lawyers are not computers and codes are not legislature
I'm specifically talking about defensive programming and how it resembles lawmakers' failure to come up with a sound plan to restrain hedge funds and other market makers, not about how human minds compare with processors or how legal frameworks compare with a well-defined runtime environment that runs binaries. Besides, I said 'financial engineering' as in quants who have to price derivatives, so why are you even writing this? The wind will not blow with malicious intent to collapse the bridge you built, but rest assured the traders competing with you are watching everything you do, waiting for an opportunity to exploit a weakness in your model or execution plan.

>MS rant

I'm not talking about their code quality project. I'm pointing at their research efforts and results in functional programming (their OCaml dialect F#), strong typing (TypeScript), and formal verification (F* and, obviously, Z3). They all deserve some criticism, but just shrugging them off as corporatism is impotent.

>>9901
Well, this is new. When did they invent automated tools to verify Common Lisp source code? How did they deal with macros?

Unique IPs: 6
