This article introduces Python Tetris steerable-convolution classification | sparse identification algorithms | neural differential-equation solvers. We hope it offers some useful reference for solving programming problems; interested developers, follow along and learn with us!
🎯Key Points
🎯Group convolutional networks: implementing cyclic groups, visualizing group actions, implementing lifting convolution kernels, and testing the generalization of group convolutional networks trained on the MNIST training set | 🎯Steerable convolutional networks: representations and harmonic analysis of compact groups, verifying regular-representation results in code, implementing irreducible representations, harmonic analysis of groups via the Fourier transform, and implementing steerable convolutional networks | 🎯Deep probabilistic models: modeling a univariate response variable given high-dimensional, structured inputs; implementing categorical-response, ordinal-response, and sequence-labeling models | 🎯Deep discrete latent-variable models: using the FashionMNIST dataset; implementing products of Bernoulli distributions and uniform categorical distributions, testing prior distributions, and implementing conditional probability distributions | 🎯Flow-based generative models | 🎯Hyperparameter tuning and multi-GPU programming | 🎯Bayesian neural network implementation | 🎯Sparse identification of nonlinear dynamical systems | 🎯Neural network solvers for partial differential equations | 🎯Neural network solvers for ordinary differential equations
🎯Language models: invoice and contract processing in Python | disentangled-attention language models
🎯Nonlinear systems: cobweb plots, orbit diagrams, and Poincaré sections/surfaces of deterministic nonlinear systems in Julia and Python
🍇Python Tetris Steerable-Convolution Classification
Tetrominoes, sometimes also called tetriminos, blocks, or four-square pieces, are the pieces used in every known Tetris game. They come in seven shapes, each of which can be rotated before being dropped, and each covers exactly four squares. In some Tetris variants the pieces also differ in color. A tetromino is a polyomino made of four squares; the seven one-sided tetrominoes are I, O, T, S, Z, J, and L.
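For the classifiers below, each piece is simply encoded as the 3-D integer coordinates of its four blocks, and the training examples are random rotations of these small point clouds. A minimal sketch of that encoding, using e3nn's o3.rand_matrix as in the dataset code that follows (the variable names here are illustrative only):

import torch
from e3nn import o3

# one piece from the dataset below, written as four unit-cube coordinates
piece = torch.tensor(
    [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 1, 0)],
    dtype=torch.get_default_dtype(),
)

R = o3.rand_matrix()     # a random 3x3 rotation matrix
rotated = piece @ R.T    # the same piece in a random orientation
print(rotated)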
💦Fitting the Tetris dataset with a gated model
import logging

import torch
from torch_cluster import radius_graph
from torch_geometric.data import Data, DataLoader
from torch_scatter import scatter

from e3nn import o3
from e3nn.nn import FullyConnectedNet, Gate
from e3nn.o3 import FullyConnectedTensorProduct


def tetris():
    # the eight pieces: a chiral pair plus six achiral shapes, each made of four blocks
    pos = [
        [(0, 0, 0), (0, 0, 1), (1, 0, 0), (1, 1, 0)],
        [(0, 0, 0), (0, 0, 1), (1, 0, 0), (1, -1, 0)],
        [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)],
        [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, 3)],
        [(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0)],
        [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 1, 0)],
        [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 1, 1)],
        [(0, 0, 0), (1, 0, 0), (1, 1, 0), (2, 1, 0)],
    ]
    pos = torch.tensor(pos, dtype=torch.get_default_dtype())

    # one odd (pseudo-scalar) label separates the chiral pair, six even labels cover the rest
    labels = torch.tensor(
        [
            [+1, 0, 0, 0, 0, 0, 0],
            [-1, 0, 0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0, 0, 0],
            [0, 0, 1, 0, 0, 0, 0],
            [0, 0, 0, 1, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0],
            [0, 0, 0, 0, 0, 1, 0],
            [0, 0, 0, 0, 0, 0, 1],
        ],
        dtype=torch.get_default_dtype(),
    )

    # random rotation of every piece, then one big batch (same construction as in the second listing below)
    pos = torch.einsum("zij,zaj->zai", o3.rand_matrix(len(pos)), pos)
    dataset = [Data(pos=p) for p in pos]
    data = next(iter(DataLoader(dataset, batch_size=len(dataset))))
    return data, labels


def mean_std(name, x) -> None:
    print(f"{name} \t{x.mean():.1f} ± ({x.var(0).mean().sqrt():.1f}|{x.std():.1f})")


class Convolution(torch.nn.Module):
    def __init__(self, irreps_in, irreps_sh, irreps_out, num_neighbors) -> None:
        super().__init__()
        self.num_neighbors = num_neighbors
        tp = FullyConnectedTensorProduct(
            irreps_in1=irreps_in,
            irreps_in2=irreps_sh,
            irreps_out=irreps_out,
            internal_weights=False,
            shared_weights=False,
        )
        # the tensor-product weights are predicted from 3 scalar edge features
        self.fc = FullyConnectedNet([3, 256, tp.weight_numel], torch.relu)
        self.tp = tp
        self.irreps_out = self.tp.irreps_out

    def forward(self, node_features, edge_src, edge_dst, edge_attr, edge_scalars) -> torch.Tensor:
        weight = self.fc(edge_scalars)
        edge_features = self.tp(node_features[edge_src], edge_attr, weight)
        node_features = scatter(edge_features, edge_dst, dim=0).div(self.num_neighbors**0.5)
        return node_features


class Network(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.num_neighbors = 3.8
        self.irreps_sh = o3.Irreps.spherical_harmonics(3)
        irreps = self.irreps_sh

        # first layer with gate nonlinearity
        gate = Gate(
            "16x0e + 16x0o", [torch.relu, torch.abs],  # scalars
            "8x0e + 8x0o + 8x0e + 8x0o", [torch.relu, torch.tanh, torch.relu, torch.tanh],  # gates (scalars)
            "16x1o + 16x1e",  # gated tensors
        )
        self.conv = Convolution(irreps, self.irreps_sh, gate.irreps_in, self.num_neighbors)
        self.gate = gate
        irreps = self.gate.irreps_out

        # final layer: one pseudo-scalar plus six scalars, matching the labels
        self.final = Convolution(irreps, self.irreps_sh, "0o + 6x0e", self.num_neighbors)
        self.irreps_out = self.final.irreps_out
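The listing above stops after the constructor of Network. The sketch below fills in a plausible forward pass and training loop so the gated model can actually be run; it follows the same message-passing pattern as the invariant-polynomial model in the next section, and the three-feature edge-length embedding (powers of the edge length) is an assumption standing in for whatever embedding the original article used:

# Hypothetical completion of the gated model above; the class/function names and the
# edge-length embedding are illustrative assumptions, not the article's original code.
class GatedTetrisNetwork(Network):
    def forward(self, data) -> torch.Tensor:
        num_nodes = 4  # every piece has four blocks
        edge_src, edge_dst = radius_graph(x=data.pos, r=1.1, batch=data.batch)
        edge_vec = data.pos[edge_src] - data.pos[edge_dst]
        edge_attr = o3.spherical_harmonics(
            l=self.irreps_sh, x=edge_vec, normalize=True, normalization="component"
        )
        # three scalar edge features, matching FullyConnectedNet([3, 256, ...]) above
        length = edge_vec.norm(dim=1)
        edge_scalars = torch.stack([torch.ones_like(length), length, length**2], dim=1)

        x = scatter(edge_attr, edge_dst, dim=0).div(self.num_neighbors**0.5)
        x = self.conv(x, edge_src, edge_dst, edge_attr, edge_scalars)
        x = self.gate(x)
        x = self.final(x, edge_src, edge_dst, edge_attr, edge_scalars)
        return scatter(x, data.batch, dim=0).div(num_nodes**0.5)


def train_gated() -> None:
    data, labels = tetris()
    f = GatedTetrisNetwork()
    optim = torch.optim.Adam(f.parameters(), lr=1e-2)
    for step in range(200):
        pred = f(data)
        loss = (pred - labels).pow(2).sum()
        optim.zero_grad()
        loss.backward()
        optim.step()
        if step % 10 == 0:
            accuracy = pred.round().eq(labels).all(dim=1).double().mean().item()
            print(f"step {step:5d} | loss {loss:<10.3f} | {100 * accuracy:5.1f}% accuracy")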
💦Fitting the Tetris dataset with an invariant polynomial
import logging

import torch
from torch_cluster import radius_graph
from torch_geometric.data import Data, DataLoader
from torch_scatter import scatter

from e3nn import o3
from e3nn.o3 import FullyConnectedTensorProduct


def tetris():
    pos = [
        [(0, 0, 0), (0, 0, 1), (1, 0, 0), (1, 1, 0)],
        [(0, 0, 0), (0, 0, 1), (1, 0, 0), (1, -1, 0)],
        [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)],
        [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, 3)],
        [(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0)],
        [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 1, 0)],
        [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 1, 1)],
        [(0, 0, 0), (1, 0, 0), (1, 1, 0), (2, 1, 0)],
    ]
    pos = torch.tensor(pos, dtype=torch.get_default_dtype())

    labels = torch.tensor(
        [
            [+1, 0, 0, 0, 0, 0, 0],
            [-1, 0, 0, 0, 0, 0, 0],
            [0, 1, 0, 0, 0, 0, 0],
            [0, 0, 1, 0, 0, 0, 0],
            [0, 0, 0, 1, 0, 0, 0],
            [0, 0, 0, 0, 1, 0, 0],
            [0, 0, 0, 0, 0, 1, 0],
            [0, 0, 0, 0, 0, 0, 1],
        ],
        dtype=torch.get_default_dtype(),
    )

    # random rotation of every piece, then one big batch
    pos = torch.einsum("zij,zaj->zai", o3.rand_matrix(len(pos)), pos)
    dataset = [Data(pos=p) for p in pos]
    data = next(iter(DataLoader(dataset, batch_size=len(dataset))))
    return data, labels


class InvariantPolynomial(torch.nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.irreps_sh: o3.Irreps = o3.Irreps.spherical_harmonics(3)
        irreps_mid = o3.Irreps("64x0e + 24x1e + 24x1o + 16x2e + 16x2o")
        irreps_out = o3.Irreps("0o + 6x0e")

        self.tp1 = FullyConnectedTensorProduct(
            irreps_in1=self.irreps_sh,
            irreps_in2=self.irreps_sh,
            irreps_out=irreps_mid,
        )
        self.tp2 = FullyConnectedTensorProduct(
            irreps_in1=irreps_mid,
            irreps_in2=self.irreps_sh,
            irreps_out=irreps_out,
        )
        self.irreps_out = self.tp2.irreps_out

    def forward(self, data) -> torch.Tensor:
        num_neighbors = 2  # typical number of neighbors
        num_nodes = 4  # typical number of nodes
        edge_src, edge_dst = radius_graph(x=data.pos, r=1.1, batch=data.batch)
        edge_vec = data.pos[edge_src] - data.pos[edge_dst]
        edge_sh = o3.spherical_harmonics(
            l=self.irreps_sh,
            x=edge_vec,
            normalize=False,
            normalization="component",
        )

        # embed the neighborhood geometry, then two equivariant tensor-product layers
        node_features = scatter(edge_sh, edge_dst, dim=0).div(num_neighbors**0.5)

        edge_features = self.tp1(node_features[edge_src], edge_sh)
        node_features = scatter(edge_features, edge_dst, dim=0).div(num_neighbors**0.5)

        edge_features = self.tp2(node_features[edge_src], edge_sh)
        node_features = scatter(edge_features, edge_dst, dim=0).div(num_neighbors**0.5)

        # pool over the four blocks of each piece
        return scatter(node_features, data.batch, dim=0).div(num_nodes**0.5)


def main() -> None:
    data, labels = tetris()
    f = InvariantPolynomial()
    optim = torch.optim.Adam(f.parameters(), lr=1e-2)

    # == Train ==
    for step in range(200):
        pred = f(data)
        loss = (pred - labels).pow(2).sum()

        optim.zero_grad()
        loss.backward()
        optim.step()

        if step % 10 == 0:
            accuracy = pred.round().eq(labels).all(dim=1).double().mean(dim=0).item()
            print(f"epoch {step:5d} | loss {loss:<10.1f} | {100 * accuracy:5.1f}% accuracy")


def test() -> None:
    data, labels = tetris()
    f = InvariantPolynomial()

    pred = f(data)
    loss = (pred - labels).pow(2).sum()
    loss.backward()

    # the output must be rotation invariant: a freshly rotated copy of the dataset gives the same predictions
    rotated_data, _ = tetris()
    error = f(rotated_data) - f(data)
    assert error.abs().max() < 1e-5
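To run this listing as a script, the two entry points just need to be called; a minimal guard (an addition to the original listing):

if __name__ == "__main__":
    main()   # train the invariant polynomial on the Tetris pieces
    test()   # check that predictions do not change under a fresh random rotation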
👉See also 1: Computational Thinking
👉See also 2: 亚图跨际
That concludes this article on Python Tetris steerable-convolution classification | sparse identification algorithms | neural differential-equation solvers. We hope the articles we recommend are helpful to fellow programmers!