Implementing gradient descent and logistic regression in Python
This article shares concrete code for implementing gradient descent and logistic regression in Python, for your reference. The details are as follows.
import numpy as np
import pandas as pd
import os

data = pd.read_csv('iris.csv')  # the iris data here has already been preprocessed
m, n = data.shape
dataMatIn = np.ones((m, n))
dataMatIn[:, :-1] = data.iloc[:, :-1]  # .ix was removed from pandas; use .iloc
classLabels = data.iloc[:, -1]

# sigmoid function
def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# stochastic gradient descent
def Stocgrad_descent(dataMatIn, classLabels):
    dataMatrix = np.mat(dataMatIn)              # training set
    labelMat = np.mat(classLabels).transpose()  # y values
    m, n = np.shape(dataMatrix)                 # m: number of rows, n: number of columns of dataMatrix
    weights = np.ones((n, 1))                   # initialize regression coefficients (n, 1)
    alpha = 0.001                               # step size
    maxCycle = 500                              # maximum number of iterations
    epsilon = 0.001
    error = np.zeros((n, 1))
    for i in range(maxCycle):
        for j in range(m):  # update on one sample at a time (true stochastic step)
            h = sigmoid(dataMatrix[j] * weights)  # predicted probability for sample j
            weights = weights + alpha * dataMatrix[j].transpose() * (labelMat[j] - h)  # gradient step
        if np.linalg.norm(weights - error) < epsilon:  # stop once the weights have converged
            break
        else:
            error = weights
    return weights

# logistic regression
def pred_result(dataMatIn):
    dataMatrix = np.mat(dataMatIn)
    r = Stocgrad_descent(dataMatIn, classLabels)
    p = sigmoid(dataMatrix * r)  # probabilities predicted by the model
    # binarize the predictions
    pred = []
    for i in range(len(data)):
        if p[i] > 0.5:
            pred.append(1)
        else:
            pred.append(0)
    data['pred'] = pred
    if os.path.exists('data_and_pred.csv'):  # delete the old output file, if any, before saving
        os.remove('data_and_pred.csv')
    data.to_csv('data_and_pred.csv', index=False, encoding='utf_8_sig')  # save the dataset

pred_result(dataMatIn)
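Since the preprocessed iris.csv is not included with the article, here is a minimal self-contained sketch of the same per-sample update rule on synthetic data (all names here are illustrative, not from the original code), which makes it easy to check that the method actually learns a decision boundary:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_gd(X, y, alpha=0.1, epochs=200):
    # X: (m, n) features including a bias column of ones; y: (m,) labels in {0, 1}
    m, n = X.shape
    w = np.zeros(n)
    for _ in range(epochs):
        for j in range(m):
            h = sigmoid(X[j] @ w)           # predicted probability for sample j
            w += alpha * (y[j] - h) * X[j]  # per-sample gradient step
    return w

# Synthetic, linearly separable data: the label is 1 exactly when the feature exceeds 0
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
X = np.column_stack([x, np.ones_like(x)])  # feature plus bias column
y = (x > 0).astype(float)

w = stochastic_gd(X, y)
pred = (sigmoid(X @ w) > 0.5).astype(float)
accuracy = (pred == y).mean()
```

On separable one-dimensional data like this, the learned boundary settles close to 0 and the training accuracy should be near perfect, which is a quick sanity check before running the same update rule on a real dataset.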
That is all for this article. I hope it helps with your learning, and I hope you will continue to support 好吧啦網(wǎng).
