[Python PyTorch] Implementing Logistic Regression in PyTorch

2024-09-07 06:18

This article walks through implementing logistic regression with PyTorch; hopefully it offers a useful reference for developers tackling this problem. Read on to follow along!

A PyTorch logistic regression learning demo:

import torch
import torch.nn as nn
import torchvision.datasets as dsets
import torchvision.transforms as transforms
# Hyper-parameters
input_size = 784        # each 28x28 MNIST image is flattened to a 784-dim vector
num_classes = 10
num_epochs = 10
batch_size = 50
learning_rate = 0.001

# MNIST dataset (images and labels)
train_dataset = dsets.MNIST(root='./data', train=True,
                            transform=transforms.ToTensor(), download=True)
print(train_dataset)
test_dataset = dsets.MNIST(root='./data', train=False,
                           transform=transforms.ToTensor())

# Dataset loaders (input pipeline)
train_loader = torch.utils.data.DataLoader(dataset=train_dataset,
                                           batch_size=batch_size, shuffle=True)
test_loader = torch.utils.data.DataLoader(dataset=test_dataset,
                                          batch_size=batch_size, shuffle=False)

# Model: logistic regression is a single linear layer producing raw class scores (logits)
class LogisticRegression(nn.Module):
    def __init__(self, input_size, num_classes):
        super(LogisticRegression, self).__init__()
        self.linear = nn.Linear(input_size, num_classes)

    def forward(self, x):
        return self.linear(x)

model = LogisticRegression(input_size, num_classes)

# Loss and optimizer
# Softmax is computed internally by CrossEntropyLoss.
# Only model.parameters() are set to be updated.
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)

# Train the model
for epoch in range(num_epochs):
    for i, (images, labels) in enumerate(train_loader):
        # Flatten each batch from (batch, 1, 28, 28) to (batch, 784)
        images = images.view(-1, 28 * 28)

        # Forward + backward + optimize
        optimizer.zero_grad()
        outputs = model(images)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()

        if (i + 1) % 100 == 0:
            print('Epoch: [%d/%d], Step: [%d/%d], Loss: %.4f'
                  % (epoch + 1, num_epochs, i + 1,
                     len(train_dataset) // batch_size, loss.item()))

# Test the model
correct = 0
total = 0
with torch.no_grad():  # no gradients needed for evaluation
    for images, labels in test_loader:
        images = images.view(-1, 28 * 28)
        outputs = model(images)
        _, predicted = torch.max(outputs, 1)
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

print('Accuracy of the model on the 10000 test images: %d %%' % (100 * correct / total))

# Save the trained parameters (state_dict) only
torch.save(model.state_dict(), 'model.pkl')
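
Because nn.CrossEntropyLoss applies the softmax internally, the model's forward pass returns raw scores rather than probabilities. As a quick sanity check (a minimal sketch with made-up values, not part of the original demo), CrossEntropyLoss gives the same result as LogSoftmax followed by NLLLoss:

import torch
import torch.nn as nn

# Fake batch: 4 samples, 10 classes (illustrative values only)
logits = torch.randn(4, 10)
targets = torch.tensor([1, 0, 4, 9])

# CrossEntropyLoss fuses LogSoftmax and NLLLoss into one call
ce = nn.CrossEntropyLoss()(logits, targets)
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)
print(torch.allclose(ce, nll))  # True

This is also why no softmax layer appears in LogisticRegression: putting one before CrossEntropyLoss would apply softmax twice.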

Run output:

Epoch: [1/10], Step: [100/1200], Loss: 2.2397
Epoch: [1/10], Step: [200/1200], Loss: 2.1378
Epoch: [1/10], Step: [300/1200], Loss: 2.0500
Epoch: [1/10], Step: [400/1200], Loss: 1.9401
Epoch: [1/10], Step: [500/1200], Loss: 1.9175
Epoch: [1/10], Step: [600/1200], Loss: 1.8203
Epoch: [1/10], Step: [700/1200], Loss: 1.7322
Epoch: [1/10], Step: [800/1200], Loss: 1.6910
Epoch: [1/10], Step: [900/1200], Loss: 1.6678
Epoch: [1/10], Step: [1000/1200], Loss: 1.5577
Epoch: [1/10], Step: [1100/1200], Loss: 1.5113
Epoch: [1/10], Step: [1200/1200], Loss: 1.5671
Epoch: [2/10], Step: [100/1200], Loss: 1.4560
Epoch: [2/10], Step: [200/1200], Loss: 1.3170
Epoch: [2/10], Step: [300/1200], Loss: 1.3822
Epoch: [2/10], Step: [400/1200], Loss: 1.2793
Epoch: [2/10], Step: [500/1200], Loss: 1.4281
Epoch: [2/10], Step: [600/1200], Loss: 1.2763
Epoch: [2/10], Step: [700/1200], Loss: 1.1570
Epoch: [2/10], Step: [800/1200], Loss: 1.1050
Epoch: [2/10], Step: [900/1200], Loss: 1.1151
Epoch: [2/10], Step: [1000/1200], Loss: 1.0385
Epoch: [2/10], Step: [1100/1200], Loss: 1.0978
Epoch: [2/10], Step: [1200/1200], Loss: 1.0007
Epoch: [3/10], Step: [100/1200], Loss: 1.1849
Epoch: [3/10], Step: [200/1200], Loss: 1.0002
Epoch: [3/10], Step: [300/1200], Loss: 1.0198
Epoch: [3/10], Step: [400/1200], Loss: 0.9248
Epoch: [3/10], Step: [500/1200], Loss: 0.8974
Epoch: [3/10], Step: [600/1200], Loss: 1.1095
Epoch: [3/10], Step: [700/1200], Loss: 1.0900
Epoch: [3/10], Step: [800/1200], Loss: 1.0178
Epoch: [3/10], Step: [900/1200], Loss: 0.9809
Epoch: [3/10], Step: [1000/1200], Loss: 0.9831
Epoch: [3/10], Step: [1100/1200], Loss: 0.8701
Epoch: [3/10], Step: [1200/1200], Loss: 0.9855
Epoch: [4/10], Step: [100/1200], Loss: 0.9081
Epoch: [4/10], Step: [200/1200], Loss: 0.8791
Epoch: [4/10], Step: [300/1200], Loss: 0.7540
Epoch: [4/10], Step: [400/1200], Loss: 0.9443
Epoch: [4/10], Step: [500/1200], Loss: 0.9346
Epoch: [4/10], Step: [600/1200], Loss: 0.8974
Epoch: [4/10], Step: [700/1200], Loss: 0.8897
Epoch: [4/10], Step: [800/1200], Loss: 0.7797
Epoch: [4/10], Step: [900/1200], Loss: 0.8608
Epoch: [4/10], Step: [1000/1200], Loss: 0.9216
Epoch: [4/10], Step: [1100/1200], Loss: 0.8676
Epoch: [4/10], Step: [1200/1200], Loss: 0.9251
Epoch: [5/10], Step: [100/1200], Loss: 0.7640
Epoch: [5/10], Step: [200/1200], Loss: 0.6955
Epoch: [5/10], Step: [300/1200], Loss: 0.8431
Epoch: [5/10], Step: [400/1200], Loss: 0.8489
Epoch: [5/10], Step: [500/1200], Loss: 0.7191
Epoch: [5/10], Step: [600/1200], Loss: 0.6671
Epoch: [5/10], Step: [700/1200], Loss: 0.6980
Epoch: [5/10], Step: [800/1200], Loss: 0.6837
Epoch: [5/10], Step: [900/1200], Loss: 0.9087
Epoch: [5/10], Step: [1000/1200], Loss: 0.7784
Epoch: [5/10], Step: [1100/1200], Loss: 0.7890
Epoch: [5/10], Step: [1200/1200], Loss: 1.0480
Epoch: [6/10], Step: [100/1200], Loss: 0.5834
Epoch: [6/10], Step: [200/1200], Loss: 0.8300
Epoch: [6/10], Step: [300/1200], Loss: 0.8316
Epoch: [6/10], Step: [400/1200], Loss: 0.7249
Epoch: [6/10], Step: [500/1200], Loss: 0.6184
Epoch: [6/10], Step: [600/1200], Loss: 0.7505
Epoch: [6/10], Step: [700/1200], Loss: 0.6599
Epoch: [6/10], Step: [800/1200], Loss: 0.7170
Epoch: [6/10], Step: [900/1200], Loss: 0.6857
Epoch: [6/10], Step: [1000/1200], Loss: 0.6543
Epoch: [6/10], Step: [1100/1200], Loss: 0.5679
Epoch: [6/10], Step: [1200/1200], Loss: 0.8261
Epoch: [7/10], Step: [100/1200], Loss: 0.7144
Epoch: [7/10], Step: [200/1200], Loss: 0.7573
Epoch: [7/10], Step: [300/1200], Loss: 0.7254
Epoch: [7/10], Step: [400/1200], Loss: 0.5918
Epoch: [7/10], Step: [500/1200], Loss: 0.6959
Epoch: [7/10], Step: [600/1200], Loss: 0.7058
Epoch: [7/10], Step: [700/1200], Loss: 0.7382
Epoch: [7/10], Step: [800/1200], Loss: 0.7282
Epoch: [7/10], Step: [900/1200], Loss: 0.6750
Epoch: [7/10], Step: [1000/1200], Loss: 0.6019
Epoch: [7/10], Step: [1100/1200], Loss: 0.6615
Epoch: [7/10], Step: [1200/1200], Loss: 0.5851
Epoch: [8/10], Step: [100/1200], Loss: 0.6492
Epoch: [8/10], Step: [200/1200], Loss: 0.5439
Epoch: [8/10], Step: [300/1200], Loss: 0.6613
Epoch: [8/10], Step: [400/1200], Loss: 0.6486
Epoch: [8/10], Step: [500/1200], Loss: 0.8281
Epoch: [8/10], Step: [600/1200], Loss: 0.6263
Epoch: [8/10], Step: [700/1200], Loss: 0.6541
Epoch: [8/10], Step: [800/1200], Loss: 0.5080
Epoch: [8/10], Step: [900/1200], Loss: 0.7020
Epoch: [8/10], Step: [1000/1200], Loss: 0.6421
Epoch: [8/10], Step: [1100/1200], Loss: 0.6207
Epoch: [8/10], Step: [1200/1200], Loss: 0.9254
Epoch: [9/10], Step: [100/1200], Loss: 0.7428
Epoch: [9/10], Step: [200/1200], Loss: 0.6815
Epoch: [9/10], Step: [300/1200], Loss: 0.6418
Epoch: [9/10], Step: [400/1200], Loss: 0.7096
Epoch: [9/10], Step: [500/1200], Loss: 0.6846
Epoch: [9/10], Step: [600/1200], Loss: 0.5124
Epoch: [9/10], Step: [700/1200], Loss: 0.6300
Epoch: [9/10], Step: [800/1200], Loss: 0.6340
Epoch: [9/10], Step: [900/1200], Loss: 0.5593
Epoch: [9/10], Step: [1000/1200], Loss: 0.5706
Epoch: [9/10], Step: [1100/1200], Loss: 0.6258
Epoch: [9/10], Step: [1200/1200], Loss: 0.7627
Epoch: [10/10], Step: [100/1200], Loss: 0.5254
Epoch: [10/10], Step: [200/1200], Loss: 0.5318
Epoch: [10/10], Step: [300/1200], Loss: 0.5448
Epoch: [10/10], Step: [400/1200], Loss: 0.5634
Epoch: [10/10], Step: [500/1200], Loss: 0.6398
Epoch: [10/10], Step: [600/1200], Loss: 0.7158
Epoch: [10/10], Step: [700/1200], Loss: 0.6169
Epoch: [10/10], Step: [800/1200], Loss: 0.5641
Epoch: [10/10], Step: [900/1200], Loss: 0.5698
Epoch: [10/10], Step: [1000/1200], Loss: 0.5612
Epoch: [10/10], Step: [1100/1200], Loss: 0.5126
Epoch: [10/10], Step: [1200/1200], Loss: 0.6746
Accuracy of the model on the 10000 test images: 87 %

Process finished with exit code 0
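
The last line of the demo saves only the model's learned parameters (its state_dict), not the whole object. Below is a minimal sketch of reloading those weights for single-image inference; it assumes the LogisticRegression class from the demo above is still in scope, and the sample index is chosen arbitrarily:

import torch
import torchvision.datasets as dsets
import torchvision.transforms as transforms

# Rebuild the same architecture, then load the saved weights
model = LogisticRegression(784, 10)  # assumes the class defined in the demo
model.load_state_dict(torch.load('model.pkl'))
model.eval()

# Classify one test image (index 0 is arbitrary)
test_dataset = dsets.MNIST(root='./data', train=False,
                           transform=transforms.ToTensor())
image, label = test_dataset[0]
with torch.no_grad():
    logits = model(image.view(-1, 28 * 28))
    predicted = logits.argmax(dim=1).item()
print('predicted: %d, actual: %d' % (predicted, label))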

That concludes this article on implementing logistic regression with PyTorch. We hope it proves helpful to fellow developers!



