Signals are asynchronous software interrupts. They are usually raised when users or utilities send signals to processes, when hardware exceptions occur inside an application, or when software conditions arise inside an application. Most applications do not handle signals, because programmers are taught that "error checking" means checking return values, e.g., does a function return 0 or -1?
When an application doesn't handle signals, it relies on the kernel to do what it thinks is best for the application and the application's data after an error occurs. Sometimes an error is returned to the user; other times the problem is ignored. That's because the kernel reacts to categories of errors, not to the specific errors of every imaginable application.
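Here's a minimal sketch (my own illustration, not something from the post) of what "relying on the kernel" looks like in code: query a signal's current disposition with sigaction(). If the disposition is SIG_DFL, no handler was installed and the kernel's default action applies.

```c
/* Sketch: inspect the current disposition of SIGTERM.
 * SIG_DFL means the kernel's default action (terminate) will run. */
#include <signal.h>
#include <stdio.h>

int main(void)
{
    struct sigaction current;

    /* Passing NULL for the new action only reads the old one. */
    if (sigaction(SIGTERM, NULL, &current) == -1) {
        perror("sigaction");
        return 1;
    }

    if (current.sa_handler == SIG_DFL)
        printf("SIGTERM: kernel default action\n");
    else if (current.sa_handler == SIG_IGN)
        printf("SIGTERM: ignored\n");
    else
        printf("SIGTERM: custom handler installed\n");

    return 0;
}
```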
Some interesting and hard-to-find performance issues arise when applications do not handle signals: an accumulation of lost i-nodes, race conditions between statements, random data corruption, and other unexpected behavior. A lot of these issues are avoidable by catching signals and assigning them explicit actions.
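For example, here's a rough sketch (assumptions mine, not part of the original post) of installing a handler for SIGTERM and SIGINT with sigaction() so the program can flush and close its data before exiting, instead of being killed mid-write by the default action.

```c
/* Sketch: catch SIGTERM/SIGINT, record the signal, and shut down cleanly. */
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

static volatile sig_atomic_t got_signal = 0;

static void handle_signal(int signum)
{
    /* Only async-signal-safe work in a handler: record and return. */
    got_signal = signum;
}

int main(void)
{
    struct sigaction sa;

    sa.sa_handler = handle_signal;
    sigemptyset(&sa.sa_mask);
    sa.sa_flags = 0;

    if (sigaction(SIGTERM, &sa, NULL) == -1 ||
        sigaction(SIGINT,  &sa, NULL) == -1) {
        perror("sigaction");
        return 1;
    }

    /* Placeholder for real work: sleep until a signal is recorded. */
    while (!got_signal)
        pause();

    /* Flush buffers, close files, release locks, then exit cleanly. */
    printf("caught signal %d, shutting down\n", (int)got_signal);
    return 0;
}
```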
I think the biggest problem with handling signals is portability. The number and type of signals available vary across operating systems. For example, most Linux systems do not assign a default value to the null signal and define between 1 and 31 signals, whereas Unix assigns the null signal a value of 0 and can have more than 255 signals.
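One portable way to probe this at run time (again, a sketch of my own, not from the post): signal 0 is the POSIX null signal, so kill(pid, 0) performs error checking without delivering anything, and the real-time range SIGRTMIN..SIGRTMAX only exists on systems that provide it, so it has to be guarded at compile time.

```c
/* Sketch: null-signal probe plus a check for real-time signal support. */
#include <signal.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = getpid();

    /* Null signal: nothing is delivered, only existence/permission checks. */
    if (kill(pid, 0) == 0)
        printf("process %ld exists and can be signaled\n", (long)pid);
    else
        perror("kill");

#if defined(SIGRTMIN) && defined(SIGRTMAX)
    /* Real-time signals extend the classic 1..31 range where available. */
    printf("real-time signals: %d..%d\n", SIGRTMIN, SIGRTMAX);
#else
    printf("no real-time signals on this system\n");
#endif

    return 0;
}
```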
Do you have a suggestion about how to improve this blog? Let's talk about it. Contact me at David.Brenner.Jr@Gmail.com or 720-584-5229.