
Introduction to Automatic Differentiation

August 15, 2013 By Alexey Radul

The derivative, as this notion appears in the elementary differential calculus, is a familiar mathematical example of a function for which both [the domain and the range] consist of functions. – Alonzo Church, 1941

Automatic differentiation may be one of the best scientific computing techniques you’ve never heard of. If you work with computers and real numbers at the same time, I think you stand to benefit from at least a basic understanding of AD, which I hope this article will provide; and even if you are a veteran automatic differentiator, perhaps you might enjoy my take on it.

What it is

Wikipedia said it very well:

Automatic differentiation (AD), also called algorithmic differentiation or computational differentiation, is a set of techniques to numerically evaluate the derivative of a function specified by a computer program … derivatives of arbitrary order can be computed automatically, accurately to working precision, and using at most a small constant factor more arithmetic operations than the original program.

This bears repeating: any derivative or gradient, of any function you can program, or of any program that computes a function, [*] with machine accuracy and ideal asymptotic efficiency. This is good for

  • real-parameter optimization (many good methods are gradient-based)
  • sensitivity analysis (local sensitivity = derivative)
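
To make this concrete, here is a minimal sketch of one common way AD can be realized: forward mode with dual numbers, where arithmetic is overloaded so every value carries its derivative along with it. The names (`Dual`, `derivative`) and the tiny operator set are illustrative only, under my own assumptions, not a reference implementation and not the particular machinery discussed later in this article.

```python
import math

class Dual:
    """A number a + b*eps with eps**2 == 0; b carries the derivative of a."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: derivatives add
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a + a'eps)(b + b'eps) = ab + (a'b + ab')eps
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def sin(x):
    # Chain rule: d sin(u) = cos(u) du
    if isinstance(x, Dual):
        return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)
    return math.sin(x)

def derivative(f, x):
    """Run f on x + 1*eps and read the derivative off the eps coefficient."""
    return f(Dual(x, 1.0)).deriv

def f(x):
    return x * x + sin(x)              # f'(x) = 2x + cos(x)

x0 = 1.3
print(derivative(f, x0))               # forward-mode AD
print(2 * x0 + math.cos(x0))           # analytic derivative, for comparison
print((f(x0 + 1e-6) - f(x0)) / 1e-6)   # finite differences, for contrast
```

Running a sketch like this, the AD result agrees with the analytic derivative to roughly the last bit, while the finite-difference estimate typically loses about half the significant digits; and the cost is only a couple of extra arithmetic operations per operation of the original program.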
