SimGNN: A Neural Network Approach to Fast Graph Similarity Computation

Paper: SimGNN: A Neural Network Approach to Fast Graph Similarity Computation
Code: https://github.com/benedekrozemberczki/SimGNN

1. Data

An example sample:

```json
{"graph_1": [[0, 8], [0, 9], [0, 2], [0, 3], [0, 11], [1, 2], [1, 3], [1, 5], [1, 6], [1, 7], [2, 3], [2, 5], [2, 6], [2, 7], [2, 8], [2, 10], [2, 11], [3, 5], [3, 7], [3, 8], [3, 10], [3, 11], [4, 9], [4, 10], [4, 5], [4, 6], [4, 7], [5, 7], [5, 8], [5, 11], [6, 7], [6, 8], [6, 11], [7, 8], [7, 10], [7, 11], [8, 9], [10, 11]],
 "ged": 32,
 "graph_2": [[0, 1], [0, 2], [0, 4], [1, 8], [1, 10], [1, 2], [1, 7], [2, 4], [2, 7], [2, 9], [2, 11], [3, 10], [3, 11], [3, 5], [3, 6], [3, 7], [4, 9], [4, 11], [5, 8], [5, 9], [5, 6], [6, 9], [7, 9], [7, 10], [7, 11], [8, 9], [8, 10], [9, 10], [9, 11], [10, 11]],
 "labels_2": [3, 5, 6, 5, 4, 4, 3, 6, 4, 8, 6, 6],
 "labels_1": [5, 5, 9, 8, 5, 7, 6, 9, 7, 3, 5, 7]}
```

Each sample contains five fields:

- graph_1: the edge list of graph 1
- graph_2: the edge list of graph 2
- labels_1: the node labels of graph 1, from which its one-hot feature matrix is built
- labels_2: the node labels of graph 2
- ged: the graph edit distance between the two graphs, used as the similarity supervision signal

2. Model

2.1 Python files

layers.py       -- the attention module and the Neural Tensor Network module
simgnn.py       -- the model implementation
utils.py        -- helper functions
param_parser.py -- command-line parameters
main.py         -- entry point

```
--filters-1           INT    Number of filters in 1st GCN layer.   Default is 128.
--filters-2           INT    Number of filters in 2nd GCN layer.   Default is 64.
--filters-3           INT    Number of filters in 3rd GCN layer.   Default is 32.
--tensor-neurons      INT    Neurons in tensor network layer.      Default is 16.
--bottle-neck-neurons INT    Bottle neck layer neurons.            Default is 16.
--bins                INT    Number of histogram bins.             Default is 16.
--batch-size          INT    Number of pairs processed per batch.  Default is 128.
--epochs              INT    Number of SimGNN training epochs.     Default is 5.
--dropout             FLOAT  Dropout rate.                         Default is 0.5.
--learning-rate       FLOAT  Learning rate.                        Default is 0.001.
--weight-decay        FLOAT  Weight decay.                         Default is 10^-5.
--histogram           BOOL   Include histogram features.           Default is False.
```
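Before stepping into the classes in 2.2, it helps to see how one raw sample from Section 1 becomes model input. Below is a minimal sketch of that conversion. It is a hypothetical helper, not the repo's exact code: I assume node labels have already been remapped to 0..number_of_labels-1 (the repo does this via a global label dictionary built in utils.py), and that the GED is normalized by the average graph size and squashed with exp(-x) into (0, 1].

```python
import json
import math
import torch

def sample_to_tensors(path, number_of_labels):
    """Hypothetical helper: turn one JSON sample (see Section 1) into model inputs."""
    with open(path) as f:
        data = json.load(f)
    # Undirected graphs: keep both edge directions, final shape [2, 2*num_edges].
    edges_1 = data["graph_1"] + [[y, x] for x, y in data["graph_1"]]
    edges_2 = data["graph_2"] + [[y, x] for x, y in data["graph_2"]]
    edge_index_1 = torch.tensor(edges_1, dtype=torch.long).t()
    edge_index_2 = torch.tensor(edges_2, dtype=torch.long).t()
    # One-hot node features built from the integer node labels.
    features_1 = torch.nn.functional.one_hot(
        torch.tensor(data["labels_1"]), num_classes=number_of_labels).float()
    features_2 = torch.nn.functional.one_hot(
        torch.tensor(data["labels_2"]), num_classes=number_of_labels).float()
    # Assumption: normalize GED by average graph size, then map to (0, 1] with
    # exp(-x) so it is comparable with the model's sigmoid output.
    norm_ged = data["ged"] / (0.5 * (len(data["labels_1"]) + len(data["labels_2"])))
    target = torch.tensor([math.exp(-norm_ged)], dtype=torch.float)
    return {"edge_index_1": edge_index_1, "edge_index_2": edge_index_2,
            "features_1": features_1, "features_2": features_2, "target": target}
```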
2.2 Key functions and classes

Attention layer (AttentionModule)

__init__

Purpose:

- take in the args object
- create the weight matrix of shape [self.args.filters_3, self.args.filters_3] by calling setup_weights()
- initialize the weight matrix values by calling init_parameters()

```python
def __init__(self, args):
    """
    :param args: Arguments object.
    """
    super(AttentionModule, self).__init__()
    self.args = args
    self.setup_weights()
    self.init_parameters()
```

setup_weights

```python
def setup_weights(self):
    """
    Defining weights.
    """
    self.weight_matrix = torch.nn.Parameter(torch.Tensor(self.args.filters_3,
                                                         self.args.filters_3))
```

init_parameters

```python
def init_parameters(self):
    """
    Initializing weights.
    """
    torch.nn.init.xavier_uniform_(self.weight_matrix)
```

forward

In attention terms: the keys and values are the node embeddings, and the query is the transformed global context derived from them.

```python
def forward(self, embedding):  # embedding: [graph_node_num, self.args.filters_3]
    """
    Making a forward propagation pass to create a graph level representation.
    :param embedding: Result of the GCN.
    :return representation: A graph level representation vector.
    """
    global_context = torch.mean(torch.matmul(embedding, self.weight_matrix), dim=0)      # [self.args.filters_3]
    transformed_global = torch.tanh(global_context)                                      # [self.args.filters_3]
    sigmoid_scores = torch.sigmoid(torch.mm(embedding, transformed_global.view(-1, 1)))  # [graph_node_num, 1]
    representation = torch.mm(torch.t(embedding), sigmoid_scores)                        # [self.args.filters_3, 1]
    return representation
```

Full code:

```python
import torch

class AttentionModule(torch.nn.Module):
    """
    SimGNN Attention Module to make a pass on graph.
    """
    def __init__(self, args):
        """
        :param args: Arguments object.
        """
        super(AttentionModule, self).__init__()
        self.args = args
        self.setup_weights()
        self.init_parameters()

    def setup_weights(self):
        """
        Defining weights.
        """
        self.weight_matrix = torch.nn.Parameter(torch.Tensor(self.args.filters_3,
                                                             self.args.filters_3))

    def init_parameters(self):
        """
        Initializing weights.
        """
        torch.nn.init.xavier_uniform_(self.weight_matrix)

    def forward(self, embedding):
        """
        Making a forward propagation pass to create a graph level representation.
        :param embedding: Result of the GCN.
        :return representation: A graph level representation vector.
        """
        global_context = torch.mean(torch.matmul(embedding, self.weight_matrix), dim=0)
        transformed_global = torch.tanh(global_context)
        sigmoid_scores = torch.sigmoid(torch.mm(embedding, transformed_global.view(-1, 1)))
        representation = torch.mm(torch.t(embedding), sigmoid_scores)
        return representation
```
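A quick shape check of the module, assuming AttentionModule has been imported from layers.py; SimpleNamespace stands in for the parsed args (a sketch, not repo code):

```python
from types import SimpleNamespace
import torch

args = SimpleNamespace(filters_3=32)   # the only field AttentionModule reads
attention = AttentionModule(args)

embedding = torch.randn(12, 32)        # a 12-node graph, filters_3 = 32
pooled = attention(embedding)
print(pooled.shape)                    # torch.Size([32, 1])
```

Whatever the number of nodes, the output is always a [filters_3, 1] graph-level vector, which is what lets graphs of different sizes be compared downstream.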
Neural Tensor Network layer

(The class name is spelled TenorNetworkModule in the repo; the identifier is kept as-is here.)

```python
import torch

class TenorNetworkModule(torch.nn.Module):
    """
    SimGNN Tensor Network module to calculate similarity vector.
    """
    def __init__(self, args):
        """
        :param args: Arguments object.
        """
        super(TenorNetworkModule, self).__init__()
        self.args = args
        self.setup_weights()
        self.init_parameters()

    def setup_weights(self):
        """
        Defining weights.
        """
        self.weight_matrix = torch.nn.Parameter(torch.Tensor(self.args.filters_3,
                                                             self.args.filters_3,
                                                             self.args.tensor_neurons))
        self.weight_matrix_block = torch.nn.Parameter(torch.Tensor(self.args.tensor_neurons,
                                                                   2*self.args.filters_3))
        self.bias = torch.nn.Parameter(torch.Tensor(self.args.tensor_neurons, 1))

    def init_parameters(self):
        """
        Initializing weights.
        """
        torch.nn.init.xavier_uniform_(self.weight_matrix)
        torch.nn.init.xavier_uniform_(self.weight_matrix_block)
        torch.nn.init.xavier_uniform_(self.bias)

    def forward(self, embedding_1, embedding_2):
        # embedding_1: [self.args.filters_3, 1]
        # embedding_2: [self.args.filters_3, 1]
        """
        Making a forward propagation pass to create a similarity vector.
        :param embedding_1: Result of the 1st embedding after attention.
        :param embedding_2: Result of the 2nd embedding after attention.
        :return scores: A similarity score vector.
        """
        scoring = torch.mm(torch.t(embedding_1),
                           self.weight_matrix.view(self.args.filters_3, -1))         # [1, self.args.filters_3*self.args.tensor_neurons]
        scoring = scoring.view(self.args.filters_3, self.args.tensor_neurons)        # [self.args.filters_3, self.args.tensor_neurons]
        scoring = torch.mm(torch.t(scoring), embedding_2)                            # [self.args.tensor_neurons, 1]
        combined_representation = torch.cat((embedding_1, embedding_2))              # [2*self.args.filters_3, 1]
        block_scoring = torch.mm(self.weight_matrix_block, combined_representation)  # [self.args.tensor_neurons, 1]
        scores = torch.nn.functional.relu(scoring + block_scoring + self.bias)       # [self.args.tensor_neurons, 1]
        return scores
```
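The reshape/mm sequence in forward is a flattened way of applying the 3-D weight tensor: each output neuron n computes the bilinear form embedding_1ᵀ · weight_matrix[:, :, n] · embedding_2. A sketch of an equivalent einsum formulation, for illustration only (not from the repo):

```python
import torch

filters_3, tensor_neurons = 32, 16
e1 = torch.randn(filters_3, 1)
e2 = torch.randn(filters_3, 1)
W = torch.randn(filters_3, filters_3, tensor_neurons)

# scores[n] = sum_ij e1[i] * W[i, j, n] * e2[j]
bilinear = torch.einsum('ik,ijn,jk->nk', e1, W, e2)   # [tensor_neurons, 1]

# Reference: the module's reshape/mm version.
reference = torch.mm(
    torch.t(torch.mm(torch.t(e1), W.view(filters_3, -1)).view(filters_3, tensor_neurons)),
    e2)
print(torch.allclose(bilinear, reference, atol=1e-5))  # True
```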
SimGNN model

__init__

Purpose:

- take in the args object and the number of node label types (number_of_labels)
- build the model by calling setup_layers()

```python
def __init__(self, args, number_of_labels):
    """
    :param args: Arguments object.
    :param number_of_labels: Number of node labels.
    """
    super(SimGNN, self).__init__()
    self.args = args
    self.number_labels = number_of_labels  # number of distinct node label types
    self.setup_layers()
```

calculate_bottleneck_features

Purpose: decide whether the histogram features extracted by the lower branch of the architecture are appended to the bottleneck input.

```python
def calculate_bottleneck_features(self):
    """
    Deciding the shape of the bottleneck layer.
    """
    if self.args.histogram == True:
        self.feature_count = self.args.tensor_neurons + self.args.bins
    else:
        self.feature_count = self.args.tensor_neurons
```

setup_layers

Purpose: build the model, which consists of three graph convolution layers, the attention module, the Neural Tensor Network module, and two linear layers that finally output a single predicted score.

```python
def setup_layers(self):
    """
    Creating the layers.
    """
    self.calculate_bottleneck_features()
    self.convolution_1 = GCNConv(self.number_labels, self.args.filters_1)
    self.convolution_2 = GCNConv(self.args.filters_1, self.args.filters_2)
    self.convolution_3 = GCNConv(self.args.filters_2, self.args.filters_3)
    self.attention = AttentionModule(self.args)
    self.tensor_network = TenorNetworkModule(self.args)
    self.fully_connected_first = torch.nn.Linear(self.feature_count,
                                                 self.args.bottle_neck_neurons)
    self.scoring_layer = torch.nn.Linear(self.args.bottle_neck_neurons, 1)
```

calculate_histogram

```python
def calculate_histogram(self, abstract_features_1, abstract_features_2):
    """
    Calculate histogram from similarity matrix.
    :param abstract_features_1: Feature matrix for graph 1.
    :param abstract_features_2: Feature matrix for graph 2.
    :return hist: Histogram of similarity scores.
    """
    scores = torch.mm(abstract_features_1, abstract_features_2).detach()
    scores = scores.view(-1, 1)
    hist = torch.histc(scores, bins=self.args.bins)
    hist = hist/torch.sum(hist)
    hist = hist.view(1, -1)
    return hist
```

convolutional_pass

```python
def convolutional_pass(self, edge_index, features):
    """
    Making a convolutional pass.
    :param edge_index: Edge indices.
    :param features: Feature matrix.
    :return features: Abstract feature matrix.
    """
    features = self.convolution_1(features, edge_index)
    features = torch.nn.functional.relu(features)
    features = torch.nn.functional.dropout(features,
                                           p=self.args.dropout,
                                           training=self.training)
    features = self.convolution_2(features, edge_index)
    features = torch.nn.functional.relu(features)
    features = torch.nn.functional.dropout(features,
                                           p=self.args.dropout,
                                           training=self.training)
    features = self.convolution_3(features, edge_index)
    return features
```

forward

Purpose: run the network and produce the predicted similarity score.

```python
def forward(self, data):
    """
    Forward pass with graphs.
    :param data: Data dictionary.
    :return score: Similarity score.
    """
    edge_index_1 = data["edge_index_1"]
    edge_index_2 = data["edge_index_2"]
    features_1 = data["features_1"]
    features_2 = data["features_2"]
    # Graph convolutions.
    abstract_features_1 = self.convolutional_pass(edge_index_1, features_1)  # [graph1_num_node, self.args.filters_3]
    abstract_features_2 = self.convolutional_pass(edge_index_2, features_2)  # [graph2_num_node, self.args.filters_3]
    # Histogram of pairwise node similarities.
    if self.args.histogram == True:
        hist = self.calculate_histogram(abstract_features_1,
                                        torch.t(abstract_features_2))
    # Attention pooling to graph-level embeddings.
    pooled_features_1 = self.attention(abstract_features_1)
    pooled_features_2 = self.attention(abstract_features_2)
    scores = self.tensor_network(pooled_features_1, pooled_features_2)  # [self.args.tensor_neurons, 1]
    scores = torch.t(scores)                                            # [1, self.args.tensor_neurons]
    # Concatenate the NTN features with the histogram features.
    if self.args.histogram == True:
        scores = torch.cat((scores, hist), dim=1).view(1, -1)           # [1, self.feature_count]
    # The ground-truth similarity is normalized into (0, 1], hence the sigmoid.
    scores = torch.nn.functional.relu(self.fully_connected_first(scores))  # [1, self.args.bottle_neck_neurons]
    score = torch.sigmoid(self.scoring_layer(scores))                   # self.scoring_layer(scores): [1, 1]
    return score
```
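Two details worth noting in calculate_histogram: the .detach() means no gradient flows through this branch (torch.histc is not differentiable), and the histogram is normalized to sum to 1 so that graph size does not leak into the feature. A toy shape check (the numbers are made up for illustration):

```python
import torch

n1, n2, f3, bins = 10, 12, 32, 16
abstract_features_1 = torch.randn(n1, f3)
abstract_features_2 = torch.randn(n2, f3)

# Pairwise node-node similarity matrix, flattened into one column.
scores = torch.mm(abstract_features_1, torch.t(abstract_features_2)).view(-1, 1)
hist = torch.histc(scores, bins=bins)
hist = (hist / hist.sum()).view(1, -1)
print(hist.shape, float(hist.sum()))   # torch.Size([1, 16]) 1.0
```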
Full code:

```python
import torch
from torch_geometric.nn import GCNConv

class SimGNN(torch.nn.Module):
    """
    SimGNN: A Neural Network Approach to Fast Graph Similarity Computation
    https://arxiv.org/abs/1808.05689
    """
    def __init__(self, args, number_of_labels):
        """
        :param args: Arguments object.
        :param number_of_labels: Number of node labels.
        """
        super(SimGNN, self).__init__()
        self.args = args
        self.number_labels = number_of_labels
        self.setup_layers()

    def calculate_bottleneck_features(self):
        """
        Deciding the shape of the bottleneck layer.
        """
        if self.args.histogram == True:
            self.feature_count = self.args.tensor_neurons + self.args.bins
        else:
            self.feature_count = self.args.tensor_neurons

    def setup_layers(self):
        """
        Creating the layers.
        """
        self.calculate_bottleneck_features()
        self.convolution_1 = GCNConv(self.number_labels, self.args.filters_1)
        self.convolution_2 = GCNConv(self.args.filters_1, self.args.filters_2)
        self.convolution_3 = GCNConv(self.args.filters_2, self.args.filters_3)
        self.attention = AttentionModule(self.args)
        self.tensor_network = TenorNetworkModule(self.args)
        self.fully_connected_first = torch.nn.Linear(self.feature_count,
                                                     self.args.bottle_neck_neurons)
        self.scoring_layer = torch.nn.Linear(self.args.bottle_neck_neurons, 1)

    def calculate_histogram(self, abstract_features_1, abstract_features_2):
        """
        Calculate histogram from similarity matrix.
        :param abstract_features_1: Feature matrix for graph 1.
        :param abstract_features_2: Feature matrix for graph 2.
        :return hist: Histogram of similarity scores.
        """
        scores = torch.mm(abstract_features_1, abstract_features_2).detach()
        scores = scores.view(-1, 1)
        hist = torch.histc(scores, bins=self.args.bins)
        hist = hist/torch.sum(hist)
        hist = hist.view(1, -1)
        return hist

    def convolutional_pass(self, edge_index, features):
        """
        Making a convolutional pass.
        :param edge_index: Edge indices.
        :param features: Feature matrix.
        :return features: Abstract feature matrix.
        """
        features = self.convolution_1(features, edge_index)
        features = torch.nn.functional.relu(features)
        features = torch.nn.functional.dropout(features,
                                               p=self.args.dropout,
                                               training=self.training)
        features = self.convolution_2(features, edge_index)
        features = torch.nn.functional.relu(features)
        features = torch.nn.functional.dropout(features,
                                               p=self.args.dropout,
                                               training=self.training)
        features = self.convolution_3(features, edge_index)
        return features

    def forward(self, data):
        """
        Forward pass with graphs.
        :param data: Data dictionary.
        :return score: Similarity score.
        """
        edge_index_1 = data["edge_index_1"]
        edge_index_2 = data["edge_index_2"]
        features_1 = data["features_1"]
        features_2 = data["features_2"]
        # Graph convolutions.
        abstract_features_1 = self.convolutional_pass(edge_index_1, features_1)
        abstract_features_2 = self.convolutional_pass(edge_index_2, features_2)
        # Histogram of pairwise node similarities.
        if self.args.histogram == True:
            hist = self.calculate_histogram(abstract_features_1,
                                            torch.t(abstract_features_2))
        # Attention pooling to graph-level embeddings.
        pooled_features_1 = self.attention(abstract_features_1)
        pooled_features_2 = self.attention(abstract_features_2)
        scores = self.tensor_network(pooled_features_1, pooled_features_2)
        scores = torch.t(scores)
        # Concatenate the NTN features with the histogram features.
        if self.args.histogram == True:
            scores = torch.cat((scores, hist), dim=1).view(1, -1)
        # The ground-truth similarity is normalized into (0, 1], hence the sigmoid.
        scores = torch.nn.functional.relu(self.fully_connected_first(scores))
        score = torch.sigmoid(self.scoring_layer(scores))
        return score
```
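To tie everything together, a hypothetical end-to-end smoke test. SimpleNamespace mirrors the defaults listed in 2.1, sample_to_tensors is the sketch from after Section 2.1 (not repo code), and the file path is made up for illustration:

```python
from types import SimpleNamespace
import torch

args = SimpleNamespace(filters_1=128, filters_2=64, filters_3=32,
                       tensor_neurons=16, bottle_neck_neurons=16,
                       bins=16, dropout=0.5, histogram=True)
model = SimGNN(args, number_of_labels=16)

# Hypothetical path; sample_to_tensors is the helper sketched earlier.
data = sample_to_tensors("dataset/train/0.json", number_of_labels=16)
prediction = model(data)                                   # [1, 1], in (0, 1)
loss = torch.nn.functional.mse_loss(prediction,
                                    data["target"].view(1, 1))
loss.backward()
print(float(prediction), float(loss))
```

With histogram=True, feature_count is tensor_neurons + bins = 32, which matches the [1, 32] vector entering fully_connected_first; gradients flow through the GCN/attention/NTN path but not through the detached histogram branch.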