Flink Series Articles
1. Flink deployment, concepts, source, transformation, and sink usage examples, the four cornerstones with examples, and more: comprehensive series article links
13. Flink Table API and SQL: basic concepts, the common API, and getting-started examples
14. Flink Table API and SQL data types: built-in data types and their properties
15. Flink Table API and SQL streaming concepts: a detailed introduction to dynamic tables, time attributes, configuring how update results are handled, temporal tables, joins on streams, determinism on streams, and query configuration
16. Flink Table API and SQL, connecting to external systems: connectors and formats for reading and writing external systems, FileSystem example 1
16. Flink Table API and SQL, connecting to external systems: connectors and formats for reading and writing external systems, Elasticsearch example 2
16. Flink Table API and SQL, connecting to external systems: connectors and formats for reading and writing external systems, Apache Kafka example 3
16. Flink Table API and SQL, connecting to external systems: connectors and formats for reading and writing external systems, JDBC example 4
16. Flink Table API and SQL, connecting to external systems: connectors and formats for reading and writing external systems, Apache Hive example 6
20. Flink SQL's SQL Client: try Flink SQL without writing any code, and submit SQL jobs directly to a cluster
22. Flink Table API and SQL: DDL for creating tables
24. Flink Table API and SQL Catalogs: introduction, types, DDL via the Java API and SQL, and catalog operations via the Java API and SQL (part 1)
24. Flink Table API and SQL Catalogs: operating on databases and tables via the Java API (part 2)
24. Flink Table API and SQL Catalogs: operating on views via the Java API (part 3)
26. Flink SQL: overview and getting-started examples
27. Flink SQL SELECT (select, where, distinct, order by, limit, set operations, and deduplication): introduction and detailed examples 1
27. Flink SQL SELECT (SQL Hints and Joins): introduction and detailed examples 2
27. Flink SQL SELECT (window functions): introduction and detailed examples 3
27. Flink SQL SELECT (window aggregation): introduction and detailed examples 4
27. Flink SQL SELECT (Group Aggregation, Over Aggregation, and Window Join): introduction and detailed examples 5
27. Flink SQL SELECT (Top-N, Window Top-N, and Window Deduplication): introduction and detailed examples 6
27. Flink SQL SELECT (Pattern Recognition): introduction and detailed examples 7
29. Flink SQL DESCRIBE, EXPLAIN, USE, SHOW, LOAD, UNLOAD, SET, RESET, JAR, JOB Statements, UPDATE, DELETE (part 1)
29. Flink SQL DESCRIBE, EXPLAIN, USE, SHOW, LOAD, UNLOAD, SET, RESET, JAR, JOB Statements, UPDATE, DELETE (part 2)
30. Flink SQL Client examples with Kafka and filesystem: configuration-file usage for tables, views, and more
32. Flink Table API and SQL: implementing user-defined Sources and Sinks, with detailed examples
41. Flink's Hive dialect: introduction and detailed examples
42. Flink Table API and SQL: Hive Catalog
43. Flink reading and writing Hive, with detailed verification examples
44. Flink modules: introduction and usage examples, and using Hive built-in functions and user-defined functions in Flink SQL in detail (some claims found online appear to be wrong)

Table of contents
Flink Series Articles
5. Catalog API
    3. View operations
        1. Official example
        2. Example: creating a Hive view with SQL (1. Maven dependencies; 2. Code; 3. Run results)
        3. Example: creating a Hive view with the API (1. Maven dependencies; 2. Code; 3. Run results)

This article briefly introduces operating on views through the Java API, with three examples covering the two implementation approaches: SQL and the Java API.
This article assumes that working Flink, Hive, and Hadoop clusters are available.
The Java API examples in this article were built against Flink 1.13.5; unless otherwise noted, the SQL examples target Flink 1.17.
5. Catalog API
3. View operations
1. Official example
// create view
catalog.createTable(new ObjectPath("mydb", "myview"), new CatalogViewImpl(...), false);

// drop view
catalog.dropTable(new ObjectPath("mydb", "myview"), false);

// alter view
catalog.alterTable(new ObjectPath("mydb", "mytable"), new CatalogViewImpl(...), false);

// rename view
catalog.renameTable(new ObjectPath("mydb", "myview"), "my_new_view", false);

// get view
catalog.getTable("myview");

// check if a view exist or not
catalog.tableExists("mytable");

// list views in a database
catalog.listViews("mydb");
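The official snippet leaves the CatalogViewImpl arguments as "...". Below is a minimal sketch of a filled-in create call, using the (now deprecated) Flink 1.13-era CatalogViewImpl constructor that the API example later in this article also uses; the table, view, and column names here are illustrative placeholders, not values from the official docs.

import java.util.HashMap;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.catalog.CatalogViewImpl;
import org.apache.flink.table.catalog.ObjectPath;

// schema of the view: a single STRING column named "name" (placeholder)
TableSchema viewSchema = new TableSchema(
        new String[] { "name" },
        new TypeInformation[] { Types.STRING });

// originalQuery is the SQL as written; expandedQuery qualifies table names with the database
CatalogViewImpl myView = new CatalogViewImpl(
        "select name from mytable",
        "select mytable.name from mydb.mytable",
        viewSchema,
        new HashMap<>(),        // view properties
        "an example view");     // comment

catalog.createTable(new ObjectPath("mydb", "myview"), myView, false);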
2. Example: creating a Hive view with SQL
1. Maven dependencies
<properties>
    <encoding>UTF-8</encoding>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <java.version>1.8</java.version>
    <scala.version>2.12</scala.version>
    <flink.version>1.13.6</flink.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-clients_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-scala_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-scala_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.11</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-scala-bridge_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-java-bridge_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <!-- blink planner, the default since 1.11 -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner-blink_2.11</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-common</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <!-- flink connectors -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka_2.12</artifactId>
        <version>${flink.version}</version>
        <!-- <scope>provided</scope> -->
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-sql-connector-kafka_2.12</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-jdbc_2.12</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-csv</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-json</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-hive_2.12</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-metastore</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>3.1.2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-shaded-hadoop-2-uber</artifactId>
        <version>2.7.5-10.0</version>
        <!-- <scope>provided</scope> -->
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.38</version>
        <scope>provided</scope>
        <!-- <version>8.0.20</version> -->
    </dependency>
    <!-- logging -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.7</version>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
        <scope>runtime</scope>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>1.2.44</version>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <version>1.18.2</version>
        <!-- <scope>provided</scope> -->
    </dependency>
</dependencies>
<build>
    <sourceDirectory>src/main/java</sourceDirectory>
    <plugins>
        <!-- compiler plugin -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.5.1</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
                <!-- <encoding>${project.build.sourceEncoding}</encoding> -->
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.18.1</version>
            <configuration>
                <useFile>false</useFile>
                <disableXmlReport>true</disableXmlReport>
                <includes>
                    <include>**/*Test.*</include>
                    <include>**/*Suite.*</include>
                </includes>
            </configuration>
        </plugin>
        <!-- shade plugin (bundles all dependencies into the jar) -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <!-- zip -d learn_spark.jar META-INF/*.RSA META-INF/*.DSA META-INF/*.SF -->
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <transformers>
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <!-- jar entry point (optional) -->
                                <mainClass>org.table_sql.TestHiveViewBySQLDemo</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

2. Code
package org.table_sql;

import java.util.HashMap;
import java.util.List;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.SqlDialect;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.catalog.CatalogDatabaseImpl;
import org.apache.flink.table.catalog.CatalogView;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.hive.HiveCatalog;
import org.apache.flink.table.module.hive.HiveModule;
import org.apache.flink.types.Row;
import org.apache.flink.util.CollectionUtil;

/**
 * @author alanchan
 */
public class TestHiveViewBySQLDemo {
    public static final String tableName = "viewtest";
    public static final String hive_create_table_sql = "CREATE TABLE " + tableName + " (\n" +
            "  id INT,\n" +
            "  name STRING,\n" +
            "  age INT" + ") " +
            "TBLPROPERTIES (\n" +
            "  'sink.partition-commit.delay'='5 s',\n" +
            "  'sink.partition-commit.trigger'='partition-time',\n" +
            "  'sink.partition-commit.policy.kind'='metastore,success-file'" + ")";

    /**
     * @param args
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tenv = StreamTableEnvironment.create(env);
        String moduleName = "myhive";
        String hiveVersion = "3.1.2";
        tenv.loadModule(moduleName, new HiveModule(hiveVersion));

        String name = "alan_hive";
        String defaultDatabase = "default";
        String databaseName = "viewtest_db";
        String hiveConfDir = "/usr/local/bigdata/apache-hive-3.1.2-bin/conf";

        HiveCatalog hiveCatalog = new HiveCatalog(name, defaultDatabase, hiveConfDir);
        tenv.registerCatalog(name, hiveCatalog);
        tenv.useCatalog(name);
        tenv.listDatabases();
        hiveCatalog.createDatabase(databaseName, new CatalogDatabaseImpl(new HashMap<>(), hiveConfDir) {}, true);
        // tenv.executeSql("create database " + databaseName);
        tenv.useDatabase(databaseName);

        // create the first view, viewName_byTable
        String selectSQL = "select * from " + tableName;
        String viewName_byTable = "test_view_table_V";
        String createViewSQL = "create view " + viewName_byTable + " as " + selectSQL;

        tenv.getConfig().setSqlDialect(SqlDialect.HIVE);
        tenv.executeSql(hive_create_table_sql);
        // tenv.getConfig().setSqlDialect(SqlDialect.DEFAULT);
        String insertSQL = "insert into " + tableName + " values (1,'alan',18)";
        tenv.executeSql(insertSQL);

        tenv.executeSql(createViewSQL);
        tenv.listViews();
        CatalogView catalogView = (CatalogView) hiveCatalog.getTable(new ObjectPath(databaseName, viewName_byTable));

        List<Row> results = CollectionUtil.iteratorToList(tenv.executeSql("select * from " + viewName_byTable).collect());
        for (Row row : results) {
            System.out.println("test_view_table_V: " + row.toString());
        }

        // create the second view
        String viewName_byView = "test_view_view_V";
        tenv.executeSql("create view " + viewName_byView + " (v2_id,v2_name,v2_age) comment 'test_view_view_V comment' as select * from " + viewName_byTable);
        catalogView = (CatalogView) hiveCatalog.getTable(new ObjectPath(databaseName, viewName_byView));

        results = CollectionUtil.iteratorToList(tenv.executeSql("select * from " + viewName_byView).collect());
        System.out.println("test_view_view_V comment : " + catalogView.getComment());
        for (Row row : results) {
            System.out.println("test_view_view_V : " + row.toString());
        }

        tenv.executeSql("drop database " + databaseName + " cascade");
    }
}
3. Run results
Prerequisite: the Flink cluster must be up and usable. Package the project into a jar with Maven, then submit it:
[alanchan@server2 bin]$ flink run /usr/local/bigdata/flink-1.13.5/examples/table/table_sql-0.0.2-SNAPSHOT.jar
Hive Session ID = ed6d5c9b-e00f-4881-840d-24c72aba6db7
Hive Session ID = 14445dc8-1f08-4f0f-bb45-aba8c6f52174
Job has been submitted with JobID bff7b59367bd5de6e778b442c4cc4404
Hive Session ID = 4c16f4fc-4c10-4353-b322-e6633e3ebe3d
Hive Session ID = 57949f09-bdcb-497f-a85c-ed9766fc4ce3
2023-10-13 02:42:24,891 INFO  org.apache.hadoop.mapred.FileInputFormat [] - Total input files to process : 0
Job has been submitted with JobID 80e48bb76e3d580412fdcdc434a8a979
test_view_table_V: +I[1, alan, 18]
Hive Session ID = a73d5b93-2129-4159-ad5e-0814df77e987
Hive Session ID = e4ae1a79-4d5e-4835-81de-ebc2041eedf9
2023-10-13 02:42:33,648 INFO  org.apache.hadoop.mapred.FileInputFormat [] - Total input files to process : 1
Job has been submitted with JobID c228d9ce3bdce91dc68bff75d14db1e5
test_view_view_V comment : test_view_view_V comment
test_view_view_V : +I[1, alan, 18]
Hive Session ID = e4a38393-d760-4bd3-8d8b-864cbe0daba7
3. Example: creating a Hive view with the API
Creating a view through the API is comparatively cumbersome, and some of the methods involved have been deprecated across version upgrades. Creating a view via TableSchema and CatalogViewImpl is deprecated; the currently recommended approach is to create it via CatalogView and ResolvedSchema. Also note the difference between the following two parameters:
String originalQuery: the original SQL;
String expandedQuery: the query with table names qualified by the database name, possibly even including the Hive catalog name.
For example, if default is the default database and the query is select * from test1, then originalQuery can simply be "select name, value from test1", while expandedQuery is "select test1.name, test1.value from default.test1".
Altering, renaming, and dropping views are straightforward and are not covered in detail; a minimal sketch follows below.
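For reference, here is a minimal sketch of renaming and dropping a view through the hiveCatalog instance created in the code below; the renamed view name is an illustrative assumption:

import org.apache.flink.table.catalog.ObjectPath;

ObjectPath viewPath = new ObjectPath("viewtest_db", "test_view_table_V");

// renameTable works for views as well as tables; the flag is ignoreIfNotExists
hiveCatalog.renameTable(viewPath, "test_view_table_V_renamed", false);

// dropTable also drops views; the flag is ignoreIfNotExists
hiveCatalog.dropTable(new ObjectPath("viewtest_db", "test_view_table_V_renamed"), false);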
1. Maven dependencies
The dependencies are identical to the previous example; only the mainClass in the shade plugin changes to this example's class, so the full pom is not repeated.
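Concretely, only this fragment of the shade plugin configuration shown earlier changes:

<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
    <!-- jar entry point for this example -->
    <mainClass>org.table_sql.TestHiveViewByAPIDemo</mainClass>
</transformer>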
2. Code
package org.table_sql;

import static org.apache.flink.util.Preconditions.checkNotNull;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.Schema;
import org.apache.flink.table.api.SqlDialect;
import org.apache.flink.table.api.TableSchema;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.table.catalog.CatalogBaseTable;
import org.apache.flink.table.catalog.CatalogDatabaseImpl;
import org.apache.flink.table.catalog.CatalogView;
import org.apache.flink.table.catalog.CatalogViewImpl;
import org.apache.flink.table.catalog.Column;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.ResolvedCatalogView;
import org.apache.flink.table.catalog.ResolvedSchema;
import org.apache.flink.table.catalog.exceptions.CatalogException;
import org.apache.flink.table.catalog.exceptions.DatabaseNotExistException;
import org.apache.flink.table.catalog.exceptions.TableAlreadyExistException;
import org.apache.flink.table.catalog.hive.HiveCatalog;
import org.apache.flink.table.module.hive.HiveModule;
import org.apache.flink.types.Row;
import org.apache.flink.util.CollectionUtil;

/**
 * @author alanchan
 */
public class TestHiveViewByAPIDemo {
    public static final String tableName = "viewtest";
    public static final String hive_create_table_sql = "CREATE TABLE " + tableName + " (\n" +
            "  id INT,\n" +
            "  name STRING,\n" +
            "  age INT" + ") " +
            "TBLPROPERTIES (\n" +
            "  'sink.partition-commit.delay'='5 s',\n" +
            "  'sink.partition-commit.trigger'='partition-time',\n" +
            "  'sink.partition-commit.policy.kind'='metastore,success-file'" + ")";

    /**
     * @param args
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tenv = StreamTableEnvironment.create(env);
        System.setProperty("HADOOP_USER_NAME", "alanchan");

        String moduleName = "myhive";
        String hiveVersion = "3.1.2";
        tenv.loadModule(moduleName, new HiveModule(hiveVersion));

        String catalogName = "alan_hive";
        String defaultDatabase = "default";
        String databaseName = "viewtest_db";
        String hiveConfDir = "/usr/local/bigdata/apache-hive-3.1.2-bin/conf";

        HiveCatalog hiveCatalog = new HiveCatalog(catalogName, defaultDatabase, hiveConfDir);
        tenv.registerCatalog(catalogName, hiveCatalog);
        tenv.useCatalog(catalogName);
        tenv.listDatabases();
        hiveCatalog.createDatabase(databaseName, new CatalogDatabaseImpl(new HashMap<>(), hiveConfDir) {}, true);
        // tenv.executeSql("create database " + databaseName);
        tenv.useDatabase(databaseName);

        tenv.getConfig().setSqlDialect(SqlDialect.HIVE);
        tenv.executeSql(hive_create_table_sql);
        String insertSQL = "insert into " + tableName + " values (1,'alan',18)";
        String insertSQL2 = "insert into " + tableName + " values (2,'alan2',19)";
        String insertSQL3 = "insert into " + tableName + " values (3,'alan3',20)";
        tenv.executeSql(insertSQL);
        tenv.executeSql(insertSQL2);
        tenv.executeSql(insertSQL3);

        tenv.getConfig().setSqlDialect(SqlDialect.DEFAULT);
        String viewName1 = "test_view_table_V";
        String viewName2 = "test_view_table_V2";
        ObjectPath path1 = new ObjectPath(databaseName, viewName1);
        // ObjectPath.fromString("viewtest_db.test_view_table_V2")
        ObjectPath path2 = new ObjectPath(databaseName, viewName2);

        String originalQuery = "SELECT id, name, age FROM " + tableName + " WHERE id >= 1 ";
        // String originalQuery = String.format("select * from %s", tableName + " WHERE id >= 1 ");
        System.out.println("originalQuery:" + originalQuery);
        String expandedQuery = "SELECT id, name, age FROM " + databaseName + "." + tableName + " WHERE id >= 1 ";
        // String expandedQuery = String.format("select * from %s.%s", catalogName, path1.getFullName());
        System.out.println("expandedQuery:" + expandedQuery);
        String comment = "this is a comment";

        // first way to create a view (via TableSchema and CatalogViewImpl), deprecated
        createView1(originalQuery, expandedQuery, comment, hiveCatalog, path1);

        // query the view
        List<Row> results = CollectionUtil.iteratorToList(tenv.executeSql("select * from " + viewName1).collect());
        for (Row row : results) {
            System.out.println("test_view_table_V: " + row.toString());
        }

        // second way to create a view (via Schema and ResolvedSchema)
        createView2(originalQuery, expandedQuery, comment, hiveCatalog, path2);

        List<Row> results2 = CollectionUtil.iteratorToList(tenv.executeSql("select * from viewtest_db.test_view_table_V2").collect());
        for (Row row : results2) {
            System.out.println("test_view_table_V2: " + row.toString());
        }

        tenv.executeSql("drop database " + databaseName + " cascade");
    }

    static void createView1(String originalQuery, String expandedQuery, String comment, HiveCatalog hiveCatalog, ObjectPath path) throws Exception {
        TableSchema viewSchema = new TableSchema(
                new String[] { "id", "name", "age" },
                new TypeInformation[] { Types.INT, Types.STRING, Types.INT });
        CatalogBaseTable viewTable = new CatalogViewImpl(originalQuery, expandedQuery, viewSchema, new HashMap<>(), comment);
        hiveCatalog.createTable(path, viewTable, false);
    }

    static void createView2(String originalQuery, String expandedQuery, String comment, HiveCatalog hiveCatalog, ObjectPath path) throws Exception {
        ResolvedSchema resolvedSchema = new ResolvedSchema(
                Arrays.asList(
                        Column.physical("id", DataTypes.INT()),
                        Column.physical("name", DataTypes.STRING()),
                        Column.physical("age", DataTypes.INT())),
                Collections.emptyList(),
                null);
        CatalogView origin = CatalogView.of(
                Schema.newBuilder().fromResolvedSchema(resolvedSchema).build(),
                comment,
                // String.format("select * from tt"),
                // String.format("select * from %s.%s", TEST_CATALOG_NAME, path1.getFullName()),
                originalQuery,
                expandedQuery,
                Collections.emptyMap());
        CatalogView view = new ResolvedCatalogView(origin, resolvedSchema);
        // ObjectPath.fromString("viewtest_db.test_view_table_V2")
        hiveCatalog.createTable(path, view, false);
    }
}

3. Run results
[alanchan@server2 bin]$ flink run /usr/local/bigdata/flink-1.13.5/examples/table/table_sql-0.0.3-SNAPSHOT.jar
Hive Session ID = ab4d159a-b2d3-489e-988f-eebdc43d9517
Hive Session ID = 391de19c-5d5a-4a83-a88c-c43cca71fc63
Job has been submitted with JobID a880510032165523f3f2a559c5ab4ec9
Hive Session ID = cb063c31-eaf2-44e3-8fc0-9e8d2a6a3a5d
Job has been submitted with JobID cb05286c404b561306f8eb3969c3456a
Hive Session ID = 8132b36e-c9e2-41a2-8f42-3fe842e0991f
Job has been submitted with JobID 264aef7da1b17598bda159d946827dea
Hive Session ID = 7657be14-8188-4362-84a9-4c84c596021b
2023-10-16 07:21:19,073 INFO  org.apache.hadoop.mapred.FileInputFormat [] - Total input files to process : 3
Job has been submitted with JobID 05c2bb7265b0430cb12e00237f18444b
test_view_table_V: +I[1, alan, 18]
test_view_table_V: +I[2, alan2, 19]
test_view_table_V: +I[3, alan3, 20]
Hive Session ID = 7bb01c0d-03c9-413a-9040-c89676cec3b9
2023-10-16 07:21:27,512 INFO  org.apache.hadoop.mapred.FileInputFormat [] - Total input files to process : 3
Job has been submitted with JobID 79130d1fe56d88a784980d16e7f1cfb4
test_view_table_V2: +I[1, alan, 18]
test_view_table_V2: +I[2, alan2, 19]
test_view_table_V2: +I[3, alan3, 20]
Hive Session ID = 6d44ea95-f733-4c56-8da4-e2687a4bf945
This article briefly introduced operating on views through the Java API, with three examples covering the two implementation approaches: SQL and the Java API.