A Simple Python Implementation of a BP Neural Network

The theory behind BP (back-propagation) neural networks is explained in detail in many places online, so this article does not repeat it. Instead, it walks through a simple design and implementation of a BP neural network, and runs a quick test on an identity (auto-encoding) task.

The model of a BP neural network is shown in the diagram below.



The Basic Idea of a BP Neural Network

The learning process of a BP neural network consists of two phases: the forward propagation of information and the backward propagation of error.

Forward pass: the input data x travels from the input layer through to the final output layer z, as in the model diagram.

Backward propagation: during training, if the output produced by the forward pass deviates from the expected value, the error is propagated back from the output layer toward the input layer. On the way back, the weight w of every connection in every layer is adjusted so as to reduce the error.
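As a minimal numeric sketch of one forward/backward step, consider a single sigmoid unit with one weight (the values of w, x, the target, and the 0.1 learning rate below are arbitrary illustrative choices, not part of the network described in this article):

```python
import numpy as np

def logistic(x):
    return 1 / (1 + np.exp(-x))

# illustrative values for a single sigmoid unit with one weight
w, x, target, lr = 0.5, 1.0, 1.0, 0.1

out = logistic(w * x)                      # forward pass
delta = (target - out) * out * (1 - out)   # error term at the output
w += lr * delta * x                        # adjust the weight to reduce the error

new_out = logistic(w * x)                  # the output has moved toward the target
```

Repeating this step drives the output toward the target; back-propagation applies the same idea layer by layer, passing each layer's error terms back through its weights.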


Designing the BP Neural Network

The design splits the network into three levels: the neuron, the network layer, and the network as a whole.

First, define the sigmoid function, which is used as the activation function:

import numpy as np

def logistic(x):
    return 1 / (1 + np.exp(-x))

def logistic_derivative(x):
    return logistic(x) * (1 - logistic(x))
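A quick sanity check of the two functions (restated here so the snippet runs on its own):

```python
import numpy as np

def logistic(x):
    return 1 / (1 + np.exp(-x))

def logistic_derivative(x):
    return logistic(x) * (1 - logistic(x))

print(logistic(0))             # 0.5: the sigmoid is centered at 0
print(logistic_derivative(0))  # 0.25: the slope is steepest at 0
print(logistic(35))            # close to 1: the sigmoid saturates for large inputs
```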

Neuron Design

As the neuron design diagram shows, a BP neural network can be decomposed into a collection of neurons.

A neuron's main responsibilities:

  1. Compute on its input and produce an output.
  2. Update the weights of its connections.
  3. Feed the weight-update information back to the previous layer, implementing the feedback step.

Note this line in the weight-update code:

weight_add = self.input * self.deltas_item * learning_rate + 0.9 * self.last_weight_add  # add momentum

The term 0.9 * self.last_weight_add adds momentum, which speeds up convergence; if you are curious, remove it and compare the training results.
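To see why momentum helps, here is a toy sketch. It assumes (purely for illustration) a constant base step of 0.1 per iteration; the 0.9 factor is the same one used in weight_add:

```python
step = 0.1              # constant base step, an illustrative assumption
plain = 0.0             # accumulated movement without momentum
with_momentum = 0.0     # accumulated movement with momentum
last_add = 0.0

for _ in range(10):
    plain += step
    add = step + 0.9 * last_add   # same rule as weight_add in the code above
    with_momentum += add
    last_add = add

print(plain)          # about 1.0
print(with_momentum)  # about 4.1: the momentum term accumulates speed
```

As long as consecutive updates point in the same direction, momentum makes the effective step grow, which is why it speeds up convergence on smooth stretches of the error surface.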

The neuron code is as follows:

class Neuron:
    def __init__(self, len_input):
        # initial weights: small random values (< 0.1)
        self.weights = np.random.random(len_input) * 0.1
        # the current input of this instance
        self.input = np.ones(len_input)
        # the value passed on to the next layer
        self.output = 1
        # the error term
        self.deltas_item = 0
        # the previous weight update, kept so that momentum can be applied
        self.last_weight_add = 0

    def calc_output(self, x):
        # compute the output value
        self.input = x
        self.output = logistic(np.dot(self.weights.T, self.input))
        return self.output

    def get_back_weight(self):
        # the feedback value passed to the previous layer
        return self.weights * self.deltas_item

    def update_weight(self, target=0, back_weight=0, learning_rate=0.1, layer="OUTPUT"):
        # update the weights; since self.output is already logistic(x),
        # the sigmoid derivative at this point is output * (1 - output)
        if layer == "OUTPUT":
            self.deltas_item = (target - self.output) * self.output * (1 - self.output)
        elif layer == "HIDDEN":
            self.deltas_item = back_weight * self.output * (1 - self.output)

        # the 0.9 * self.last_weight_add term adds momentum
        weight_add = self.input * self.deltas_item * learning_rate + 0.9 * self.last_weight_add
        self.weights += weight_add
        self.last_weight_add = weight_add
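To check the update rule in isolation, here is a stripped-down version of the same computation for one output neuron (no momentum; the weights, input, target, and learning rate are illustrative values, not taken from the class above): training repeatedly on a single example should drive the output toward the target.

```python
import numpy as np

def logistic(x):
    return 1 / (1 + np.exp(-x))

weights = np.array([0.05, 0.02, 0.08])  # small initial weights, as in Neuron
x = np.array([1.0, 0.0, 1.0])
target = 1.0
learning_rate = 0.5

for _ in range(200):
    output = logistic(np.dot(weights, x))
    deltas_item = (target - output) * output * (1 - output)  # output-layer rule
    weights += x * deltas_item * learning_rate               # same update as update_weight

print(output)  # approaches the target 1.0
```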

Network Layer Design

This code manages a single network layer; layers come in two kinds, hidden layers and the output layer. (The input layer can use the input data directly, so it is not implemented as a separate class.)

A network layer mainly manages the neurons of its own layer, so it exposes the same interface as a neuron and delegates inward to implement its functionality.

For convenience, each layer also keeps a reference to the next layer.

class NetLayer:
    '''
    Network layer wrapper.
    Manages the list of neurons in the current layer.
    '''
    def __init__(self, len_node, in_count):
        '''
        :param len_node: number of neurons in this layer
        :param in_count: number of inputs to this layer
        '''
        # the neurons of the current layer
        self.neurons = [Neuron(in_count) for _ in range(len_node)]
        # reference to the next layer, for recursive operations
        self.next_layer = None

    def calc_output(self, x):
        output = np.array([node.calc_output(x) for node in self.neurons])
        if self.next_layer is not None:
            return self.next_layer.calc_output(output)
        return output

    def get_back_weight(self):
        return sum([node.get_back_weight() for node in self.neurons])

    def update_weight(self, learning_rate, target):
        '''
        Update the weights of this layer and of all layers after it.
        This is implemented recursively, so callers must invoke it on the
        first network layer (the layer right after the input layer).
        :param learning_rate: learning rate
        :param target: expected output
        '''
        layer = "OUTPUT"
        back_weight = np.zeros(len(self.neurons))
        if self.next_layer is not None:
            back_weight = self.next_layer.update_weight(learning_rate, target)
            layer = "HIDDEN"
        for i, node in enumerate(self.neurons):
            target_item = 0 if len(target) <= i else target[i]
            node.update_weight(target=target_item, back_weight=back_weight[i],
                               learning_rate=learning_rate, layer=layer)
        return self.get_back_weight()

Implementing the BP Neural Network

This class manages the whole network and exposes the training and prediction interfaces.

The network is constructed from a list: the first element is the number of inputs, the last element is the number of output neurons, and the elements in between are the neuron counts of the hidden layers.

Since the layers are linked together in a chain, operating on layers[0] operates on the whole network.
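For example, for the [8, 3, 8] network used in the test below, construct_network pairs each layer size with the one before it; the following sketch mimics just that pairing, independent of the classes:

```python
layers = [8, 3, 8]
# (number of neurons, inputs per neuron) for each constructed NetLayer
shapes = [(layers[i], layers[i - 1]) for i in range(1, len(layers))]
print(shapes)  # [(3, 8), (8, 3)]
```

So the hidden layer has 3 neurons with 8 inputs each, and the output layer has 8 neurons with 3 inputs each; since each layer holds a next_layer reference, calling calc_output or update_weight on layers[0] reaches the whole chain.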

class NeuralNetWork:

    def __init__(self, layers):
        self.layers = []
        self.construct_network(layers)

    def construct_network(self, layers):
        last_layer = None
        for i, layer in enumerate(layers):
            if i == 0:
                continue
            cur_layer = NetLayer(layer, layers[i - 1])
            self.layers.append(cur_layer)
            if last_layer is not None:
                last_layer.next_layer = cur_layer
            last_layer = cur_layer

    def fit(self, x_train, y_train, learning_rate=0.1, epochs=100000, shuffle=False):
        '''
        Train the network; by default the samples are visited in order.
        Option 1: train in the order of the training data.
        Option 2: shuffle and train in random order.
        :param x_train: input data
        :param y_train: expected output data
        :param learning_rate: learning rate
        :param epochs: number of passes over the data
        :param shuffle: visit the samples in random order
        '''
        indices = np.arange(len(x_train))
        for _ in range(epochs):
            if shuffle:
                np.random.shuffle(indices)
            for i in indices:
                self.layers[0].calc_output(x_train[i])
                self.layers[0].update_weight(learning_rate, y_train[i])

    def predict(self, x):
        return self.layers[0].calc_output(x)

Test Code

In the test data the expected output is identical to the input; that is, we are testing an autoencoder. (If you are not familiar with autoencoders, look them up online.)
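Why might 3 hidden units be enough for 8 one-hot patterns at all? Three roughly binary activations have 2**3 = 8 distinct states, so in the ideal case the hidden layer learns something like a 3-bit binary code. This hand-written enumeration is just an illustration, not what the network is guaranteed to learn:

```python
# map each of the 8 one-hot inputs to a distinct 3-bit code
codes = [[(i >> b) & 1 for b in range(3)] for i in range(8)]
print(codes[0], codes[5], codes[7])  # [0, 0, 0] [1, 0, 1] [1, 1, 1]
```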

if __name__ == '__main__':
    print("test neural network")

    data = np.array([[1, 0, 0, 0, 0, 0, 0, 0],
                     [0, 1, 0, 0, 0, 0, 0, 0],
                     [0, 0, 1, 0, 0, 0, 0, 0],
                     [0, 0, 0, 1, 0, 0, 0, 0],
                     [0, 0, 0, 0, 1, 0, 0, 0],
                     [0, 0, 0, 0, 0, 1, 0, 0],
                     [0, 0, 0, 0, 0, 0, 1, 0],
                     [0, 0, 0, 0, 0, 0, 0, 1]])

    np.set_printoptions(precision=3, suppress=True)

    # run 10 independent trainings
    for run in range(10):
        network = NeuralNetWork([8, 3, 8])
        # train the network to reproduce its input
        network.fit(data, data, learning_rate=0.1, epochs=10000)

        print("\n\n", run, "result")
        for item in data:
            print(item, network.predict(item))

Results

The results are quite good and match what was expected.

One problem: some rows in the results below contain a value of 0.317. This happens because squeezing 8 patterns into a 3-unit code is a bit tight. With an [8, 4, 8] network the value no longer appears; try it yourself.

/Library/Frameworks/Python.framework/Versions/3.4/bin/python3.4 /XXXX/機器學習/number/NeuralNetwork.py
test neural network


 0 result
[1 0 0 0 0 0 0 0] [ 0.987  0.     0.005  0.     0.     0.01   0.004  0.   ]
[0 1 0 0 0 0 0 0] [ 0.     0.985  0.     0.006  0.     0.025  0.     0.008]
[0 0 1 0 0 0 0 0] [ 0.007  0.     0.983  0.     0.007  0.027  0.     0.   ]
[0 0 0 1 0 0 0 0] [ 0.     0.005  0.     0.985  0.007  0.02   0.     0.   ]
[0 0 0 0 1 0 0 0] [ 0.     0.     0.005  0.005  0.983  0.013  0.     0.   ]
[0 0 0 0 0 1 0 0] [ 0.016  0.017  0.02   0.018  0.018  0.317  0.023  0.017]
[0 0 0 0 0 0 1 0] [ 0.006  0.     0.     0.     0.     0.026  0.984  0.006]
[0 0 0 0 0 0 0 1] [ 0.     0.005  0.     0.     0.     0.01   0.004  0.985]


 1 result
[1 0 0 0 0 0 0 0] [ 0.983  0.     0.     0.007  0.007  0.     0.     0.027]
[0 1 0 0 0 0 0 0] [ 0.     0.986  0.004  0.     0.     0.     0.005  0.01 ]
[0 0 1 0 0 0 0 0] [ 0.     0.005  0.985  0.     0.005  0.     0.     0.026]
[0 0 0 1 0 0 0 0] [ 0.005  0.     0.     0.983  0.     0.006  0.     0.015]
[0 0 0 0 1 0 0 0] [ 0.005  0.     0.004  0.     0.987  0.     0.     0.01 ]
[0 0 0 0 0 1 0 0] [ 0.     0.     0.     0.006  0.     0.984  0.005  0.018]
[0 0 0 0 0 0 1 0] [ 0.     0.008  0.     0.     0.     0.006  0.984  0.027]
[0 0 0 0 0 0 0 1] [ 0.018  0.017  0.025  0.018  0.016  0.018  0.017  0.317]


 2 result
[1 0 0 0 0 0 0 0] [ 0.966  0.     0.016  0.014  0.     0.     0.     0.   ]
[0 1 0 0 0 0 0 0] [ 0.     0.969  0.     0.016  0.     0.     0.     0.014]
[0 0 1 0 0 0 0 0] [ 0.012  0.     0.969  0.     0.     0.013  0.     0.   ]
[0 0 0 1 0 0 0 0] [ 0.014  0.014  0.     0.969  0.     0.     0.     0.   ]
[0 0 0 0 1 0 0 0] [ 0.     0.     0.     0.     0.962  0.016  0.02   0.   ]
[0 0 0 0 0 1 0 0] [ 0.     0.     0.02   0.     0.016  0.963  0.     0.   ]
[0 0 0 0 0 0 1 0] [ 0.     0.     0.     0.     0.012  0.     0.969  0.011]
[0 0 0 0 0 0 0 1] [ 0.     0.014  0.     0.     0.     0.     0.016  0.966]


 3 result
[1 0 0 0 0 0 0 0] [ 0.983  0.     0.     0.007  0.027  0.     0.     0.007]
[0 1 0 0 0 0 0 0] [ 0.     0.986  0.004  0.     0.01   0.005  0.     0.   ]
[0 0 1 0 0 0 0 0] [ 0.     0.006  0.984  0.006  0.026  0.     0.     0.   ]
[0 0 0 1 0 0 0 0] [ 0.005  0.     0.004  0.987  0.01   0.     0.     0.   ]
[0 0 0 0 1 0 0 0] [ 0.019  0.017  0.024  0.016  0.317  0.017  0.018  0.018]
[0 0 0 0 0 1 0 0] [ 0.     0.008  0.     0.     0.026  0.984  0.006  0.   ]
[0 0 0 0 0 0 1 0] [ 0.     0.     0.     0.     0.019  0.005  0.984  0.007]
[0 0 0 0 0 0 0 1] [ 0.005  0.     0.     0.     0.014  0.     0.005  0.983]


 4 result
[1 0 0 0 0 0 0 0] [ 0.969  0.014  0.     0.     0.     0.     0.014  0.   ]
[0 1 0 0 0 0 0 0] [ 0.014  0.966  0.016  0.     0.     0.     0.     0.   ]
[0 0 1 0 0 0 0 0] [ 0.     0.011  0.969  0.     0.     0.012  0.     0.   ]
[0 0 0 1 0 0 0 0] [ 0.     0.     0.     0.966  0.     0.     0.013  0.016]
[0 0 0 0 1 0 0 0] [ 0.     0.     0.     0.     0.963  0.016  0.     0.02 ]
[0 0 0 0 0 1 0 0] [ 0.     0.     0.02   0.     0.016  0.963  0.     0.   ]
[0 0 0 0 0 0 1 0] [ 0.016  0.     0.     0.014  0.     0.     0.969  0.   ]
[0 0 0 0 0 0 0 1] [ 0.     0.     0.     0.011  0.012  0.     0.     0.969]


 5 result
[1 0 0 0 0 0 0 0] [ 0.966  0.     0.016  0.     0.     0.018  0.     0.   ]
[0 1 0 0 0 0 0 0] [ 0.     0.969  0.012  0.     0.     0.     0.011  0.   ]
[0 0 1 0 0 0 0 0] [ 0.015  0.018  0.964  0.     0.     0.     0.     0.   ]
[0 0 0 1 0 0 0 0] [ 0.     0.     0.     0.968  0.013  0.     0.     0.013]
[0 0 0 0 1 0 0 0] [ 0.     0.     0.     0.015  0.965  0.015  0.     0.   ]
[0 0 0 0 0 1 0 0] [ 0.013  0.     0.     0.     0.013  0.968  0.     0.   ]
[0 0 0 0 0 0 1 0] [ 0.     0.018  0.     0.     0.     0.     0.965  0.014]
[0 0 0 0 0 0 0 1] [ 0.     0.     0.     0.018  0.     0.     0.015  0.967]


 6 result
[1 0 0 0 0 0 0 0] [ 0.983  0.006  0.     0.005  0.     0.     0.     0.016]
[0 1 0 0 0 0 0 0] [ 0.006  0.983  0.     0.     0.     0.005  0.     0.017]
[0 0 1 0 0 0 0 0] [ 0.     0.     0.987  0.005  0.     0.     0.004  0.01 ]
[0 0 0 1 0 0 0 0] [ 0.007  0.     0.007  0.983  0.     0.     0.     0.027]
[0 0 0 0 1 0 0 0] [ 0.     0.     0.     0.     0.987  0.005  0.004  0.01 ]
[0 0 0 0 0 1 0 0] [ 0.     0.007  0.     0.     0.008  0.983  0.     0.027]
[0 0 0 0 0 0 1 0] [ 0.     0.     0.005  0.     0.005  0.     0.985  0.026]
[0 0 0 0 0 0 0 1] [ 0.018  0.018  0.017  0.017  0.017  0.017  0.025  0.317]


 7 result
[1 0 0 0 0 0 0 0] [ 0.969  0.     0.     0.     0.014  0.     0.     0.015]
[0 1 0 0 0 0 0 0] [ 0.     0.963  0.02   0.     0.     0.017  0.     0.   ]
[0 0 1 0 0 0 0 0] [ 0.     0.012  0.969  0.     0.     0.     0.011  0.   ]
[0 0 0 1 0 0 0 0] [ 0.     0.     0.     0.969  0.011  0.013  0.     0.   ]
[0 0 0 0 1 0 0 0] [ 0.014  0.     0.     0.016  0.966  0.     0.     0.   ]
[0 0 0 0 0 1 0 0] [ 0.     0.016  0.     0.02   0.     0.962  0.     0.   ]
[0 0 0 0 0 0 1 0] [ 0.     0.     0.016  0.     0.     0.     0.966  0.014]
[0 0 0 0 0 0 0 1] [ 0.015  0.     0.     0.     0.     0.     0.014  0.969]


 8 result
[1 0 0 0 0 0 0 0] [ 0.966  0.016  0.013  0.     0.     0.     0.     0.   ]
[0 1 0 0 0 0 0 0] [ 0.011  0.969  0.     0.     0.     0.     0.012  0.   ]
[0 0 1 0 0 0 0 0] [ 0.014  0.     0.969  0.015  0.     0.     0.     0.   ]
[0 0 0 1 0 0 0 0] [ 0.     0.     0.015  0.969  0.     0.014  0.     0.   ]
[0 0 0 0 1 0 0 0] [ 0.     0.     0.     0.     0.963  0.     0.016  0.02 ]
[0 0 0 0 0 1 0 0] [ 0.     0.     0.     0.013  0.     0.966  0.     0.016]
[0 0 0 0 0 0 1 0] [ 0.     0.02   0.     0.     0.016  0.     0.963  0.   ]
[0 0 0 0 0 0 0 1] [ 0.     0.     0.     0.     0.012  0.011  0.     0.969]


 9 result
[1 0 0 0 0 0 0 0] [ 0.969  0.     0.     0.     0.     0.011  0.     0.011]
[0 1 0 0 0 0 0 0] [ 0.     0.968  0.     0.     0.     0.     0.018  0.015]
[0 0 1 0 0 0 0 0] [ 0.     0.     0.965  0.     0.015  0.     0.015  0.   ]
[0 0 0 1 0 0 0 0] [ 0.     0.     0.     0.966  0.018  0.016  0.     0.   ]
[0 0 0 0 1 0 0 0] [ 0.     0.     0.013  0.013  0.968  0.     0.     0.   ]
[0 0 0 0 0 1 0 0] [ 0.018  0.     0.     0.014  0.     0.964  0.     0.   ]
[0 0 0 0 0 0 1 0] [ 0.     0.013  0.013  0.     0.     0.     0.968  0.   ]
[0 0 0 0 0 0 0 1] [ 0.018  0.014  0.     0.     0.     0.     0.     0.965]

Process finished with exit code 0

