There are many mainstream open-source deep learning frameworks available to developers today, chiefly TensorFlow, PyTorch, Keras, and Caffe. Below is a detailed introduction to and comparison of these frameworks.

1. TensorFlow Deep Learning Framework: TensorFlow (Google)

TensorFlow is one of the most popular and widely used deep learning frameworks.
TensorFlow offers three ways to build computation graphs: static graphs, dynamic graphs, and Autograph. TensorFlow 2.0 defaults to dynamic graphs (eager execution): each operation executes immediately and returns its result, with no Session required. To use static graphs in TensorFlow 2.0, apply the @tf.function decorator, which converts an ordinary Python function into TensorFlow graph-building code; calling the decorated function is then roughly equivalent to executing code with a Session in TensorFlow 1.x.
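As a minimal illustration of @tf.function (the function and values here are purely for demonstration, not from the original article):

```python
import tensorflow as tf

@tf.function  # traces the Python function into a static TensorFlow graph
def scaled_sum(a, b):
    return tf.reduce_sum(a + b) * 2.0

# Called like a normal Python function; executes the traced graph
result = scaled_sum(tf.constant([1.0, 2.0]), tf.constant([3.0, 4.0]))
print(result.numpy())  # 20.0
```

The first call traces the function and builds the graph; subsequent calls with the same input signature reuse the compiled graph.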
TensorFlow example: below is sample code that trains a model with TensorFlow, saves it, and evaluates it on the test set.
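The original code block did not survive extraction. A minimal TensorFlow 2.x sketch of the described workflow (train, save, restore, evaluate on a held-out set); the model architecture and synthetic data here are illustrative assumptions, not the article's original code:

```python
import numpy as np
import tensorflow as tf

# Synthetic regression data (illustrative only; the article's dataset is unknown)
rng = np.random.default_rng(0)
x_train = rng.normal(size=(256, 4)).astype("float32")
y_train = 0.5 * x_train.sum(axis=1, keepdims=True)
x_test = rng.normal(size=(64, 4)).astype("float32")
y_test = 0.5 * x_test.sum(axis=1, keepdims=True)

# A small fully connected model; weights are built lazily on the first call
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Train, printing the loss once per epoch
history = model.fit(x_train, y_train, epochs=5, verbose=0)
for epoch, loss in enumerate(history.history["loss"], start=1):
    print(f"Epoch {epoch}: loss = {loss}")

# Save the trained model, restore it, and evaluate on the held-out test set
model.save("my_model.keras")
restored = tf.keras.models.load_model("my_model.keras")
test_loss = restored.evaluate(x_test, y_test, verbose=0)
print(f"Test loss = {test_loss}")
```

Saving with `model.save` and reloading with `tf.keras.models.load_model` round-trips the architecture, weights, and optimizer state in one file.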
Sample output (truncated; the loss decreases steadily over 100 epochs):
Epoch 1: loss = 0.33758699893951416
Epoch 2: loss = 0.11031775921583176
Epoch 3: loss = 0.09063640236854553
...
Epoch 99: loss = 0.08546742796897888
Epoch 100: loss = 0.08544846624135971
Test loss = 0.09260907769203186

2. PyTorch Deep Learning Framework: PyTorch (open-sourced by Facebook)

PyTorch is another very popular deep learning framework.
PyTorch example: fit a third-order polynomial to a sine function using PyTorch tensors, implementing the forward and backward passes through the network by hand.
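The code block itself is missing here, but the printed output below matches PyTorch's classic "fitting a third-order polynomial with tensors" tutorial example; a sketch along those lines (the seed is added here for reproducibility):

```python
import math
import torch

torch.manual_seed(0)  # for reproducibility; not part of the original example

# Input and target: 2000 points of y = sin(x) on [-pi, pi]
x = torch.linspace(-math.pi, math.pi, 2000)
y = torch.sin(x)

# Randomly initialize the polynomial coefficients y = a + b x + c x^2 + d x^3
a, b, c, d = (torch.randn(()) for _ in range(4))

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Loss: sum of squared errors
    loss = (y_pred - y).pow(2).sum().item()
    if t % 100 == 99:
        print(t, loss)

    # Backward pass: gradients of the loss w.r.t. a, b, c, d, by hand
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Gradient descent update
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a.item()} + {b.item()} x + {c.item()} x^2 + {d.item()} x^3')
```

Because the gradients are derived and applied manually, this example shows exactly what autograd and an optimizer would otherwise do for you.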
Sample output (truncated; printed every 100 iterations as "iteration loss"):
99 2351.4306640625
199 1585.7086181640625
299 1071.2376708984375
...
1899 11.023494720458984
1999 10.331212043762207
Result: y = 0.030692655593156815 + 0.8315182328224182 x + -0.005294993054121733 x^2 + -0.08974269032478333 x^3

3. Keras Deep Learning Framework: Keras (Google)

Keras (originally created by François Chollet, now TensorFlow's official high-level API):
Guiding principles
4. Caffe Deep Learning Framework: Caffe (Berkeley)
5. The State of Open-Source Deep Learning Frameworks in China
IDC, an international market research firm, released the report "China Deep Learning Framework and Platform Market Shares, 2022H2". According to the report, Baidu ranks first in overall market share for deep learning platforms in China and has further extended its lead. The Chinese open-source deep learning framework market has settled into a three-way race, with the top three frameworks together holding over 80% of the market.

6. Comparing the Frameworks

The most popular deep learning frameworks today include TensorFlow, PyTorch, and Caffe. According to O'Reilly's 2019 AI and deep learning market survey, TensorFlow was the most widely used deep learning framework, reported by 57.2% of respondents; PyTorch followed at 37.1%. Caffe and Keras were also popular, at 16.2% and 13.7% respectively.
A brief comparison table of the main features of these frameworks:
Note that each framework has its own strengths and weaknesses, and the best choice can vary by use case. When choosing a framework, evaluate and compare the candidates against your project requirements, research direction, programming skills, and personal preferences before committing to one.