A record of my learning process. The task: record audio and video based on Camera2 + OpenGL ES + MediaCodec + AudioRecord.

Requirements:

- Write SEI (supplemental enhancement information) extra data into every video frame, so that the custom per-frame data can be recovered when the stream is decoded later.
- When the record button is pressed, save the audio/video covering the N seconds before through the N seconds after the click, and split the saved files into 60-second chunks.

Up front: the whole exercise touches on the following topics, so you can quickly check whether it covers what you are looking for:

- Using MediaCodec: createInputSurface() provides a Surface that receives the Camera2 frames through EGL.
- Using AudioRecord.
- Using Camera2.
- Basic OpenGL usage.
- A simple example of writing H.264 SEI.

The overall design is simple: open the camera and set up the OpenGL environment, then start a video thread that records video data and an audio thread that records audio data. Both streams are cached in custom lists; finally, a muxing thread writes the data from the video list and the audio list into an MP4 file. The project targets Android SDK 28, because saving files to public storage is more troublesome on API 29 and above. The full project has not been uploaded yet; message me if you need it.

Everything above is modularized into separate classes. Let's start with the standalone modules.

UI layout

The UI is simple: a GLSurfaceView and two Button controls.

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <android.opengl.GLSurfaceView
        android:id="@+id/glView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <Button
        android:id="@+id/recordBtn"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginBottom="80dp"
        android:text="Record"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent" />

    <Button
        android:id="@+id/exit"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_marginTop="20dp"
        android:layout_marginRight="20dp"
        android:text="Exit"
        app:layout_constraintTop_toTopOf="parent"
        app:layout_constraintRight_toRightOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

Camera2

Using the Camera2 framework is fairly straightforward. The one thing to note is that the Surface passed into startPreview() is the one later handed to mCaptureRequestBuilder.addTarget(surface). That Surface is produced in a few basic steps, listed briefly here (the full code follows):

1. Generate an OpenGL texture: GLES30.glGenTextures(1, mTexture, 0);
2. Wrap the texture in a SurfaceTexture: mSurfaceTexture = new SurfaceTexture(mTexture[0]);
3. Create a Surface from the SurfaceTexture: mSurface = new Surface(mSurfaceTexture);
4. Start the preview: mCamera.startPreview(mSurface);

public class Camera2 {
    private final String TAG = "Abbott Camera2";
    private Context mContext;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;
    private String[] mCamList;
    private String mCameraId;
    private Size mPreviewSize;
    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;
    private CaptureRequest.Builder mCaptureRequestBuilder;
    private CaptureRequest mCaptureRequest;
    private CameraCaptureSession mCameraCaptureSession;

    public Camera2(Context context) {
        mContext = context;
        mCameraManager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
        try {
            mCamList = mCameraManager.getCameraIdList();
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        mBackgroundThread = new HandlerThread("CameraThread");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    public void openCamera(int width, int height, String id) {
        try {
            Log.d(TAG, "openCamera: id: " + id);
            CameraCharacteristics characteristics = mCameraManager.getCameraCharacteristics(id);
            if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_FRONT) {
                // front camera: nothing special done here
            }
            StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            mPreviewSize = getOptimalSize(map.getOutputSizes(SurfaceTexture.class), width, height);
            mCameraId = id;
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        try {
            if (ActivityCompat.checkSelfPermission(mContext, android.Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            Log.d(TAG, "openCamera: opening camera id " + mCameraId);
            mCameraManager.openCamera(mCameraId, mStateCallback, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    private Size getOptimalSize(Size[] sizeMap, int width, int height) {
        List<Size> sizeList = new ArrayList<>();
        for (Size option : sizeMap) {
            if (width > height) {
                if (option.getWidth() >= width && option.getHeight() >= height) {
                    sizeList.add(option);
                }
            } else {
                if (option.getWidth() >= height && option.getHeight() >= width) {
                    sizeList.add(option);
                }
            }
        }
        if (sizeList.size() > 0) {
            return Collections.min(sizeList, new Comparator<Size>() {
                @Override
                public int compare(Size lhs, Size rhs) {
                    return Long.signum((long) lhs.getWidth() * lhs.getHeight() - (long) rhs.getWidth() * rhs.getHeight());
                }
            });
        }
        return sizeMap[0];
    }

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            mCameraDevice = camera;
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            camera.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            camera.close();
            mCameraDevice = null;
        }
    };

    public void startPreview(Surface surface) {
        try {
            mCaptureRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mCaptureRequestBuilder.addTarget(surface);
            mCameraDevice.createCaptureSession(Collections.singletonList(surface), new CameraCaptureSession.StateCallback() {
                @Override
                public void onConfigured(@NonNull CameraCaptureSession session) {
                    try {
                        mCaptureRequest = mCaptureRequestBuilder.build();
                        mCameraCaptureSession = session;
                        mCameraCaptureSession.setRepeatingRequest(mCaptureRequest, null, mBackgroundHandler);
                    } catch (CameraAccessException e) {
                        e.printStackTrace();
                    }
                }

                @Override
                public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                }
            }, mBackgroundHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
}

ImageList

This class is the cache used for both the video and the audio samples. There is nothing special to explain; it is used as-is.

public class ImageList {
    private static final String TAG = "Abbott ImageList";
    private Object mImageListLock = new Object();
    int kCapacity;
    private List<ImageItem> mImageList = new CopyOnWriteArrayList<>();

    public ImageList(int capacity) {
        kCapacity = capacity;
    }

    public synchronized void addItem(long Timestamp, ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo) {
        synchronized (mImageListLock) {
            ImageItem item = new ImageItem(Timestamp, byteBuffer, bufferInfo);
            mImageList.add(item);
            // Trim the oldest entries so the cache never grows beyond kCapacity items.
            if (mImageList.size() > kCapacity) {
                int excessItems = mImageList.size() - kCapacity;
                mImageList.subList(0, excessItems).clear();
            }
        }
    }

    public synchronized List<ImageItem> getItemsInTimeRange(long startTimestamp, long endTimestamp) {
        List<ImageItem> itemsInTimeRange = new ArrayList<>();
        synchronized (mImageListLock) {
            for (ImageItem item : mImageList) {
                long itemTimestamp = item.getTimestamp();
                // Keep only items whose timestamp falls inside the requested range.
                if (itemTimestamp >= startTimestamp && itemTimestamp <= endTimestamp) {
                    itemsInTimeRange.add(item);
                }
            }
        }
        return itemsInTimeRange;
    }

    public synchronized ImageItem getItem() {
        return mImageList.get(0);
    }

    public synchronized void removeItem() {
        mImageList.remove(0);
    }

    public synchronized int getSize() {
        return mImageList.size();
    }

    public static class ImageItem {
        private long mTimestamp;
        private ByteBuffer mVideoBuffer;
        private MediaCodec.BufferInfo mVideoBufferInfo;

        public ImageItem(long first, ByteBuffer second, MediaCodec.BufferInfo bufferInfo) {
            this.mTimestamp = first;
            this.mVideoBuffer = second;
            this.mVideoBufferInfo = bufferInfo;
        }

        public synchronized long getTimestamp() {
            return mTimestamp;
        }

        public synchronized ByteBuffer getVideoByteBuffer() {
            return mVideoBuffer;
        }

        public synchronized MediaCodec.BufferInfo getVideoBufferInfo() {
            return mVideoBufferInfo;
        }
    }
}
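The capacity numbers used throughout the recorder code (1000 / 40 * Param.recordInternal for video, 1000 / 20 * Param.recordInternal for audio) are what turn ImageList into the rolling pre-roll cache behind the "N seconds before the click" requirement. Below is a minimal usage sketch; the 25 fps frame spacing and the 10-second window are assumptions for illustration, not values taken from the original project.

// Illustrative only: shows how ImageList keeps roughly the last N seconds of samples
// and how the [T - N, T + N] window is pulled out when recording is triggered.
public class ImageListUsageSketch {
    public static void demo() {
        int recordInternal = 10;                        // assumed N, in seconds
        int videoCapacity = 1000 / 40 * recordInternal; // 25 fps -> one frame every 40 ms -> 250 items
        ImageList videoCache = new ImageList(videoCapacity);

        // Simulate 20 seconds of encoded frames, one every 40 ms (40_000 us).
        for (long pts = 0; pts < 20_000_000L; pts += 40_000L) {
            android.media.MediaCodec.BufferInfo info = new android.media.MediaCodec.BufferInfo();
            info.set(0, 4, pts, 0);
            videoCache.addItem(pts, java.nio.ByteBuffer.allocate(4), info);
        }
        // addItem() trims the oldest entries, so only the most recent N seconds remain.
        android.util.Log.d("ImageListUsageSketch", "cached items: " + videoCache.getSize());

        // A record click at t = 15 s pulls out everything in [t - N, t + N] and hands it
        // to the muxing thread, exactly like the alarm path in the recorder classes below.
        long alarmUs = 15_000_000L;
        int windowSize = videoCache
                .getItemsInTimeRange(alarmUs - 10_000_000L, alarmUs + 10_000_000L).size();
        android.util.Log.d("ImageListUsageSketch", "items in the pre-roll window: " + windowSize);
    }
}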
GlProgram

A helper class for creating the OpenGL program; OpenGL ES 3.0 is used here.

public class GlProgram {
    public static final String mVertexShader =
            "#version 300 es \n" +
            "in vec4 vPosition;" +
            "in vec2 vCoordinate;" +
            "out vec2 vTextureCoordinate;" +
            "void main() {" +
            "    gl_Position = vPosition;" +
            "    vTextureCoordinate = vCoordinate;" +
            "}";

    public static final String mFragmentShader =
            "#version 300 es \n" +
            "#extension GL_OES_EGL_image_external : require \n" +
            "#extension GL_OES_EGL_image_external_essl3 : require \n" +
            "precision mediump float;" +
            "in vec2 vTextureCoordinate;" +
            "uniform samplerExternalOES oesTextureSampler;" +
            "out vec4 gl_FragColor;" +
            "void main() {" +
            "    gl_FragColor = texture(oesTextureSampler, vTextureCoordinate);" +
            "}";

    public static int createProgram(String vertexShaderSource, String fragShaderSource) {
        int program = GLES30.glCreateProgram();
        if (0 == program) {
            Log.e("Arc_ShaderManager", "create program error, error = " + GLES30.glGetError());
            return 0;
        }
        int vertexShader = loadShader(GLES30.GL_VERTEX_SHADER, vertexShaderSource);
        if (0 == vertexShader) {
            return 0;
        }
        int fragShader = loadShader(GLES30.GL_FRAGMENT_SHADER, fragShaderSource);
        if (0 == fragShader) {
            return 0;
        }
        GLES30.glAttachShader(program, vertexShader);
        GLES30.glAttachShader(program, fragShader);
        GLES30.glLinkProgram(program);
        int[] status = new int[1];
        GLES30.glGetProgramiv(program, GLES30.GL_LINK_STATUS, status, 0);
        if (GLES30.GL_FALSE == status[0]) {
            String errorMsg = GLES30.glGetProgramInfoLog(program);
            Log.e("Arc_ShaderManager", "createProgram error: " + errorMsg);
            GLES30.glDeleteShader(vertexShader);
            GLES30.glDeleteShader(fragShader);
            GLES30.glDeleteProgram(program);
            return 0;
        }
        GLES30.glDetachShader(program, vertexShader);
        GLES30.glDetachShader(program, fragShader);
        GLES30.glDeleteShader(vertexShader);
        GLES30.glDeleteShader(fragShader);
        return program;
    }

    private static int loadShader(int type, String shaderSource) {
        int shader = GLES30.glCreateShader(type);
        if (0 == shader) {
            Log.e("Arc_ShaderManager", "create shader error, shader type = " + type + ", error = " + GLES30.glGetError());
            return 0;
        }
        GLES30.glShaderSource(shader, shaderSource);
        GLES30.glCompileShader(shader);
        int[] status = new int[1];
        GLES30.glGetShaderiv(shader, GLES30.GL_COMPILE_STATUS, status, 0);
        if (0 == status[0]) {
            String errorMsg = GLES30.glGetShaderInfoLog(shader);
            Log.e("Arc_ShaderManager", "createShader shader type " + type + " error: " + errorMsg);
            GLES30.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }
}

OesTexture

Works with the GlProgram above: it draws the external OES texture using the vertex and texture coordinates fed to the shaders.

public class OesTexture {
    private static final String TAG = "Abbott OesTexture";
    private int mProgram;
    private final FloatBuffer mCordsBuffer;
    private final FloatBuffer mPositionBuffer;
    private int mPositionHandle;
    private int mCordsHandle;
    private int mOESTextureHandle;

    public OesTexture() {
        float[] positions = {
                -1.0f, 1.0f,
                -1.0f, -1.0f,
                1.0f, 1.0f,
                1.0f, -1.0f
        };
        float[] texCords = {
                0.0f, 0.0f,
                0.0f, 1.0f,
                1.0f, 0.0f,
                1.0f, 1.0f,
        };
        mPositionBuffer = ByteBuffer.allocateDirect(positions.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mPositionBuffer.put(positions).position(0);
        mCordsBuffer = ByteBuffer.allocateDirect(texCords.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mCordsBuffer.put(texCords).position(0);
    }

    public void init() {
        this.mProgram = GlProgram.createProgram(GlProgram.mVertexShader, GlProgram.mFragmentShader);
        if (0 == this.mProgram) {
            Log.e(TAG, "createProgram failed");
        }
        mPositionHandle = GLES30.glGetAttribLocation(mProgram, "vPosition");
        mCordsHandle = GLES30.glGetAttribLocation(mProgram, "vCoordinate");
        mOESTextureHandle = GLES30.glGetUniformLocation(mProgram, "oesTextureSampler");
        GLES30.glDisable(GLES30.GL_DEPTH_TEST);
    }

    public void PrepareTexture(int OESTextureId) {
        GLES30.glUseProgram(this.mProgram);
        GLES30.glEnableVertexAttribArray(mPositionHandle);
        GLES30.glVertexAttribPointer(mPositionHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mPositionBuffer);
        GLES30.glEnableVertexAttribArray(mCordsHandle);
        GLES30.glVertexAttribPointer(mCordsHandle, 2, GLES30.GL_FLOAT, false, 2 * 4, mCordsBuffer);
        GLES30.glActiveTexture(GLES30.GL_TEXTURE0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, OESTextureId);
        GLES30.glUniform1i(mOESTextureHandle, 0);
        GLES30.glDrawArrays(GLES30.GL_TRIANGLE_STRIP, 0, 4);
        GLES30.glDisableVertexAttribArray(mPositionHandle);
        GLES30.glDisableVertexAttribArray(mCordsHandle);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
    }
}

The next three classes, VideoRecorder, AudioEncoder and EncodingRunnable, have to be used together.
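All three classes read Param.recordInternal, the length of the pre/post-roll window in seconds, but the article never shows the Param class. A minimal placeholder is sketched below so the snippets compile; the field name comes from the code that follows, while the value 10 is purely an assumed example.

public class Param {
    // Assumed value: N seconds of pre-roll / post-roll. It also sizes the caches
    // (1000 / 40 * recordInternal video frames, 1000 / 20 * recordInternal audio chunks).
    public static final int recordInternal = 10;
}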
AudioEncoder

public class AudioEncoder extends Thread {
    private static final String TAG = "Abbott AudioEncoder";
    private static final int SAVEMP4_INTERNAL = Param.recordInternal * 1000 * 1000;
    private static final int SAMPLE_RATE = 44100;
    private static final int CHANNEL_COUNT = 1;
    private static final int BIT_RATE = 96000;
    private EncodingRunnable mEncodingRunnable;
    private MediaCodec mMediaCodec;
    private AudioRecord mAudioRecord;
    private MediaFormat mFormat;
    private MediaFormat mOutputFormat;
    private long nanoTime;
    int mBufferSizeInBytes = 0;
    boolean mExitThread = true;
    private ImageList mAudioList;
    private MediaCodec.BufferInfo mAudioBufferInfo;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private List<ImageList.ImageItem> mMuxerImageItem;
    private Object mLock = new Object();
    private MediaCodec.BufferInfo mAlarmBufferInfo;

    public AudioEncoder(EncodingRunnable encodingRunnable) throws IOException {
        mEncodingRunnable = encodingRunnable;
        nanoTime = System.nanoTime();
        createAudio();
        createMediaCodec();
        int kCapacity = 1000 / 20 * Param.recordInternal;
        mAudioList = new ImageList(kCapacity);
    }

    public void createAudio() {
        mBufferSizeInBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        mAudioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, mBufferSizeInBytes);
    }

    public void createMediaCodec() throws IOException {
        mFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, SAMPLE_RATE, CHANNEL_COUNT);
        mFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        mFormat.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        mFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 8192);
        mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
        mMediaCodec.configure(mFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAudio Alarm enter");
            mEncodingRunnable.setAudioFormat(mOutputFormat);
            mEncodingRunnable.setAudioAlarmTrue();
            mAlarmTime = mAlarmBufferInfo.presentationTimeUs;
            mAlarmEndTime = mAlarmTime + SAVEMP4_INTERNAL;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVEMP4_INTERNAL;
            }
            mAlarm = true;
            Log.d(TAG, "setAudio Alarm exit");
        }
    }

    @Override
    public void run() {
        super.run();
        mMediaCodec.start();
        mAudioRecord.startRecording();
        // The loop runs while mExitThread is true; stopAudioRecord() flips it to false.
        while (mExitThread) {
            synchronized (mLock) {
                byte[] inputAudioData = new byte[mBufferSizeInBytes];
                int res = mAudioRecord.read(inputAudioData, 0, inputAudioData.length);
                if (res > 0) {
                    if (mAudioRecord != null) {
                        enCodeAudio(inputAudioData);
                    }
                }
            }
        }
        Log.d(TAG, "AudioRecord run: exit");
    }

    private void enCodeAudio(byte[] inputAudioData) {
        mAudioBufferInfo = new MediaCodec.BufferInfo();
        int index = mMediaCodec.dequeueInputBuffer(-1);
        if (index < 0) {
            return;
        }
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        ByteBuffer audioInputBuffer = inputBuffers[index];
        audioInputBuffer.clear();
        audioInputBuffer.put(inputAudioData);
        audioInputBuffer.limit(inputAudioData.length);
        mMediaCodec.queueInputBuffer(index, 0, inputAudioData.length, (System.nanoTime() - nanoTime) / 1000, 0);
        int status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
        ByteBuffer outputBuffer;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else {
            while (status >= 0) {
                MediaCodec.BufferInfo tmpaudioBufferInfo = new MediaCodec.BufferInfo();
                tmpaudioBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                mAlarmBufferInfo = new MediaCodec.BufferInfo();
                mAlarmBufferInfo.set(mAudioBufferInfo.offset, mAudioBufferInfo.size, mAudioBufferInfo.presentationTimeUs, mAudioBufferInfo.flags);
                outputBuffer = mMediaCodec.getOutputBuffer(status);
                ByteBuffer buffer = ByteBuffer.allocate(tmpaudioBufferInfo.size);
                buffer.limit(tmpaudioBufferInfo.size);
                buffer.put(outputBuffer);
                buffer.flip();
                if (tmpaudioBufferInfo.size > 0) {
                    if (mAlarm) {
                        // While the alarm window is active, flush the cached window to the muxer thread.
                        mMuxerImageItem = mAudioList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                        for (ImageList.ImageItem item : mMuxerImageItem) {
                            mEncodingRunnable.pushAudio(item);
                        }
                        mAlarmStartTime = tmpaudioBufferInfo.presentationTimeUs;
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                        if (tmpaudioBufferInfo.presentationTimeUs - mAlarmTime > SAVEMP4_INTERNAL) {
                            mAlarm = false;
                            mEncodingRunnable.setAudioAlarmFalse();
                            Log.d(TAG, "mEncodingRunnable.setAudioAlarmFalse()");
                        }
                    } else {
                        mAudioList.addItem(tmpaudioBufferInfo.presentationTimeUs, buffer, tmpaudioBufferInfo);
                    }
                }
                mMediaCodec.releaseOutputBuffer(status, false);
                status = mMediaCodec.dequeueOutputBuffer(mAudioBufferInfo, 0);
            }
        }
    }

    public synchronized void stopAudioRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = false;
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }
}

VideoRecorder

public class VideoRecorder extends Thread {
    private static final String TAG = "Abbott VideoRecorder";
    private static final int SAVE_MP4_Internal = 1000 * 1000 * Param.recordInternal;
    // EGL
    private static final int EGL_RECORDABLE_ANDROID = 0x3142;
    private EGLContext mEGLContext = EGL14.EGL_NO_CONTEXT;
    private EGLDisplay mEGLDisplay = EGL14.EGL_NO_DISPLAY;
    private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE;
    private EGLContext mSharedContext = EGL14.EGL_NO_CONTEXT;
    private Surface mSurface;
    private int mOESTextureId;
    private OesTexture mOesTexture;
    private ImageList mImageList;
    private List<ImageList.ImageItem> muxerImageItem;
    // Thread
    private boolean mExitThread;
    private Object mLock = new Object();
    private Object object = new Object();
    private MediaCodec mMediaCodec;
    private MediaFormat mOutputFormat;
    private boolean mAlarm = false;
    private long mAlarmTime;
    private long mAlarmStartTime;
    private long mAlarmEndTime;
    private MediaCodec.BufferInfo mBufferInfo;
    private EncodingRunnable mEncodingRunnable;
    private String mSeiMessage;

    public VideoRecorder(EGLContext eglContext, EncodingRunnable encodingRunnable) {
        mSharedContext = eglContext;
        mEncodingRunnable = encodingRunnable;
        int kCapacity = 1000 / 40 * Param.recordInternal;
        mImageList = new ImageList(kCapacity);
        try {
            MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1920 * 1080 * 25 / 5);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mMediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mSurface = mMediaCodec.createInputSurface();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void run() {
        super.run();
        try {
            initEgl();
            mOesTexture = new OesTexture();
            mOesTexture.init();
            synchronized (mLock) {
                mLock.wait(33);
            }
            guardedRun();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void guardedRun() throws InterruptedException, RuntimeException {
        mExitThread = false;
        while (true) {
            synchronized (mLock) {
                if (mExitThread) {
                    break;
                }
                mLock.wait(33);
            }
            mOesTexture.PrepareTexture(mOESTextureId);
            swapBuffers();
            enCodeVideo();
        }
        Log.d(TAG, "guardedRun: exit");
        unInitEgl();
    }

    private void enCodeVideo() {
        mBufferInfo = new MediaCodec.BufferInfo();
        int status = mMediaCodec.dequeueOutputBuffer(mBufferInfo, 0);
        ByteBuffer outputBuffer = null;
        if (status == MediaCodec.INFO_TRY_AGAIN_LATER) {
        } else if (status == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            mOutputFormat = mMediaCodec.getOutputFormat();
        } else if (status == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        } else {
            outputBuffer = mMediaCodec.getOutputBuffer(status);
            if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                mBufferInfo.size = 0;
            }
            if (mBufferInfo.size > 0) {
                outputBuffer.position(mBufferInfo.offset);
                outputBuffer.limit(mBufferInfo.size - mBufferInfo.offset);
                mSeiMessage = "avcIndex" + String.format("%05d", 0);
            }
            mMediaCodec.releaseOutputBuffer(status, false);
        }
        if (mBufferInfo.size > 0) {
            mEncodingRunnable.setTimeUs(mBufferInfo.presentationTimeUs);
            // Prepend the SEI NAL unit to the encoded frame before caching it.
            ByteBuffer seiData = buildSEIData(mSeiMessage);
            ByteBuffer frameWithSEI = ByteBuffer.allocate(outputBuffer.remaining() + seiData.remaining());
            frameWithSEI.put(seiData);
            frameWithSEI.put(outputBuffer);
            frameWithSEI.flip();
            mBufferInfo.size = frameWithSEI.remaining();
            MediaCodec.BufferInfo tmpAudioBufferInfo = new MediaCodec.BufferInfo();
            tmpAudioBufferInfo.set(mBufferInfo.offset, mBufferInfo.size, mBufferInfo.presentationTimeUs, mBufferInfo.flags);
            if (mAlarm) {
                muxerImageItem = mImageList.getItemsInTimeRange(mAlarmStartTime, mAlarmEndTime);
                mAlarmStartTime = tmpAudioBufferInfo.presentationTimeUs;
                for (ImageList.ImageItem item : muxerImageItem) {
                    mEncodingRunnable.push(item);
                }
                mImageList.addItem(tmpAudioBufferInfo.presentationTimeUs, frameWithSEI, tmpAudioBufferInfo);
                if (mBufferInfo.presentationTimeUs - mAlarmTime > SAVE_MP4_Internal) {
                    Log.d(TAG, "mEncodingRunnable.setVideoAlarmFalse()");
                    Log.d(TAG, tmpAudioBufferInfo.presentationTimeUs + " " + mAlarmTime);
                    mAlarm = false;
                    mEncodingRunnable.setVideoAlarmFalse();
                }
            } else {
                mImageList.addItem(tmpAudioBufferInfo.presentationTimeUs, frameWithSEI, tmpAudioBufferInfo);
            }
        }
    }

    public synchronized void setAlarm() {
        synchronized (mLock) {
            Log.d(TAG, "setAlarm enter");
            mEncodingRunnable.setMediaFormat(mOutputFormat);
            mEncodingRunnable.setVideoAlarmTrue();
            if (mBufferInfo.presentationTimeUs != 0) {
                mAlarmTime = mBufferInfo.presentationTimeUs;
            }
            mAlarmEndTime = mAlarmTime + SAVE_MP4_Internal;
            if (!mAlarm) {
                mAlarmStartTime = mAlarmTime - SAVE_MP4_Internal;
            }
            mAlarm = true;
            Log.d(TAG, "setAlarm exit");
        }
    }

    public synchronized void startRecord() throws IllegalStateException {
        super.start();
        mMediaCodec.start();
    }

    public synchronized void stopVideoRecord() throws IllegalStateException {
        synchronized (mLock) {
            mExitThread = true;
            mLock.notify();
        }
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        mMediaCodec.signalEndOfInputStream();
        mMediaCodec.stop();
        mMediaCodec.release();
        mMediaCodec = null;
    }

    public void requestRender(int i) {
        synchronized (object) {
            mOESTextureId = i;
        }
    }

    private void initEgl() {
        this.mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        if (this.mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
            throw new RuntimeException("EGL14.eglGetDisplay fail...");
        }
        int[] major_version = new int[2];
        boolean eglInited = EGL14.eglInitialize(this.mEGLDisplay, major_version, 0, major_version, 1);
        if (!eglInited) {
            this.mEGLDisplay = null;
            throw new RuntimeException("EGL14.eglInitialize fail...");
        }
        // Configure the display attributes (RGBA8888, recordable surface).
        int[] attrib_list = new int[]{
                EGL14.EGL_SURFACE_TYPE, EGL14.EGL_WINDOW_BIT,
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_RED_SIZE, 8,
                EGL14.EGL_GREEN_SIZE, 8,
                EGL14.EGL_BLUE_SIZE, 8,
                EGL14.EGL_ALPHA_SIZE, 8,
                EGL14.EGL_DEPTH_SIZE, 16,
                EGL_RECORDABLE_ANDROID, 1,
                EGL14.EGL_NONE};
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        boolean eglChose = EGL14.eglChooseConfig(this.mEGLDisplay, attrib_list, 0, configs, 0, configs.length, numConfigs, 0);
        if (!eglChose) {
            throw new RuntimeException("eglChooseConfig [RGBA888 recordable] ES2 EGL_config_fail...");
        }
        int[] attr_list = {EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE};
        this.mEGLContext = EGL14.eglCreateContext(this.mEGLDisplay, configs[0], this.mSharedContext, attr_list, 0);
        checkEglError("eglCreateContext");
        if (this.mEGLContext == EGL14.EGL_NO_CONTEXT) {
            throw new RuntimeException("eglCreateContext EGL_NO_CONTEXT");
        }
        int[] surface_attr = {EGL14.EGL_NONE};
        this.mEGLSurface = EGL14.eglCreateWindowSurface(this.mEGLDisplay, configs[0], this.mSurface, surface_attr, 0);
        if (this.mEGLSurface == EGL14.EGL_NO_SURFACE) {
            throw new RuntimeException("eglCreateWindowSurface EGL_NO_SURFACE");
        }
        Log.d(TAG, "initEgl, display = " + this.mEGLDisplay + ", context = " + this.mEGLContext
                + ", sharedContext = " + this.mSharedContext + ", surface = " + this.mEGLSurface);
        boolean success = EGL14.eglMakeCurrent(this.mEGLDisplay, this.mEGLSurface, this.mEGLSurface, this.mEGLContext);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
    }

    private void unInitEgl() {
        boolean success = EGL14.eglMakeCurrent(mEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT);
        if (!success) {
            checkEglError("makeCurrent");
            throw new RuntimeException("eglMakeCurrent failed");
        }
        if (this.mEGLDisplay != EGL14.EGL_NO_DISPLAY) {
            EGL14.eglDestroySurface(this.mEGLDisplay, this.mEGLSurface);
            EGL14.eglDestroyContext(this.mEGLDisplay, this.mEGLContext);
            EGL14.eglTerminate(this.mEGLDisplay);
        }
        this.mEGLDisplay = EGL14.EGL_NO_DISPLAY;
        this.mEGLContext = EGL14.EGL_NO_CONTEXT;
        this.mEGLSurface = EGL14.EGL_NO_SURFACE;
        this.mSharedContext = EGL14.EGL_NO_CONTEXT;
        this.mSurface = null;
    }

    private boolean swapBuffers() {
        if ((null == this.mEGLDisplay) || (null == this.mEGLSurface)) {
            return false;
        }
        boolean success = EGL14.eglSwapBuffers(this.mEGLDisplay, this.mEGLSurface);
        if (!success) {
            checkEglError("eglSwapBuffers");
        }
        return success;
    }

    private void checkEglError(String msg) {
        int error = EGL14.eglGetError();
        if (error != EGL14.EGL_SUCCESS) {
            throw new RuntimeException(msg + ": EGL_ERROR_CODE: 0x" + Integer.toHexString(error));
        }
    }

    private ByteBuffer buildSEIData(String message) {
        // Build the SEI NAL unit: start code, NAL type 6 (SEI), payload type 5 (user data).
        int seiSize = 128;
        ByteBuffer seiBuffer = ByteBuffer.allocate(seiSize);
        seiBuffer.put(new byte[]{0, 0, 0, 1, 6, 5});
        // SEI payload size (one byte), then the user data itself.
        String seiMessage = "h264testdata" + message;
        seiBuffer.put((byte) seiMessage.length());
        seiBuffer.put(seiMessage.getBytes());
        seiBuffer.flip();
        return seiBuffer;
    }
}
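To recover the per-frame data when the MP4 is analyzed later, the SEI NAL written by buildSEIData() has to be located again in each video sample. The sketch below is a hedged example, not part of the original project: it assumes the byte layout produced by buildSEIData() above (start code 00 00 00 01, NAL type 6, payload type 5, one length byte, then the message text), and it only checks bytes 4 and 5 because MediaMuxer may rewrite the leading start code into a 4-byte NAL length when writing the MP4. The file path handling is illustrative.

// Decode-side sketch: walk the video track with MediaExtractor and print the SEI text
// ("h264testdata" + message) that enCodeVideo() prepended to each frame.
public class SeiReaderSketch {
    public static void dumpSei(String mp4Path) throws java.io.IOException {
        android.media.MediaExtractor extractor = new android.media.MediaExtractor();
        extractor.setDataSource(mp4Path);
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            String mime = extractor.getTrackFormat(i).getString(android.media.MediaFormat.KEY_MIME);
            if (android.media.MediaFormat.MIMETYPE_VIDEO_AVC.equals(mime)) {
                extractor.selectTrack(i);
                break;
            }
        }
        java.nio.ByteBuffer sample = java.nio.ByteBuffer.allocate(2 * 1024 * 1024);
        int size;
        while ((size = extractor.readSampleData(sample, 0)) >= 0) {
            // Bytes 0-3 are either the 00 00 00 01 start code written by buildSEIData()
            // or a 4-byte NAL length if the muxer converted it; byte 4 = 0x06 (SEI NAL),
            // byte 5 = 0x05 (user-data payload type), byte 6 = payload length.
            if (size > 7 && sample.get(4) == 6 && sample.get(5) == 5) {
                int len = Math.min(sample.get(6) & 0xFF, size - 7);
                byte[] text = new byte[len];
                sample.position(7);
                sample.get(text);
                android.util.Log.d("SeiReaderSketch", "sample @" + extractor.getSampleTime()
                        + "us SEI: " + new String(text));
            }
            sample.clear();
            extractor.advance();
        }
        extractor.release();
    }
}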
EncodingRunnable

public class EncodingRunnable extends Thread {
    private static final String TAG = "Abbott EncodingRunnable";
    private Object mRecordLock = new Object();
    private boolean mExitThread = false;
    private MediaMuxer mMediaMuxer;
    private int avcIndex;
    private int mAudioIndex;
    private MediaFormat mOutputFormat;
    private MediaFormat mAudioOutputFormat;
    private ImageList mImageList;
    private ImageList mAudioImageList;
    private boolean itemAlarm;
    private long mAudioImageListTimeUs = -1;
    private boolean mAudioAlarm;
    private int mVideoCapcity = 1000 / 40 * Param.recordInternal;
    private int mAudioCapcity = 1000 / 20 * Param.recordInternal;
    private int recordSecond = 1000 * 1000 * 60;
    long Video60sStart = -1;

    public EncodingRunnable() {
        mImageList = new ImageList(mVideoCapcity);
        mAudioImageList = new ImageList(mAudioCapcity);
    }

    private boolean mIsRecoding = false;

    public void setMediaFormat(MediaFormat OutputFormat) {
        if (mOutputFormat == null) {
            mOutputFormat = OutputFormat;
        }
    }

    public void setAudioFormat(MediaFormat OutputFormat) {
        if (mAudioOutputFormat == null) {
            mAudioOutputFormat = OutputFormat;
        }
    }

    public void setMediaMuxerConfig() {
        long currentTimeMillis = System.currentTimeMillis();
        Date currentDate = new Date(currentTimeMillis);
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyyMMdd_HHmmss", Locale.getDefault());
        String fileName = dateFormat.format(currentDate);
        File mFile = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), fileName + ".MP4");
        Log.d(TAG, "setMediaMuxerSavaPath: new MediaMuxer " + mFile.getPath());
        try {
            mMediaMuxer = new MediaMuxer(mFile.getPath(), MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } catch (IOException e) {
            e.printStackTrace();
        }
        avcIndex = mMediaMuxer.addTrack(mOutputFormat);
        mAudioIndex = mMediaMuxer.addTrack(mAudioOutputFormat);
        mMediaMuxer.start();
    }

    public void setMediaMuxerSavaPath() {
        if (!mIsRecoding) {
            mExitThread = false;
            setMediaMuxerConfig();
            setRecording();
            notifyStartRecord();
        }
    }

    @Override
    public void run() {
        super.run();
        while (true) {
            synchronized (mRecordLock) {
                try {
                    mRecordLock.wait();
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
            MediaCodec.BufferInfo tmpAudioBufferInfo = new MediaCodec.BufferInfo();
            while (mIsRecoding) {
                if (mAudioImageList.getSize() > 0) {
                    ImageList.ImageItem audioItem = mAudioImageList.getItem();
                    tmpAudioBufferInfo.set(audioItem.getVideoBufferInfo().offset,
                            audioItem.getVideoBufferInfo().size,
                            audioItem.getVideoBufferInfo().presentationTimeUs + mAudioImageListTimeUs,
                            audioItem.getVideoBufferInfo().flags);
                    mMediaMuxer.writeSampleData(mAudioIndex, audioItem.getVideoByteBuffer(), tmpAudioBufferInfo);
                    mAudioImageList.removeItem();
                }
                if (mImageList.getSize() > 0) {
                    ImageList.ImageItem item = mImageList.getItem();
                    if (Video60sStart < 0) {
                        Video60sStart = item.getVideoBufferInfo().presentationTimeUs;
                    }
                    mMediaMuxer.writeSampleData(avcIndex, item.getVideoByteBuffer(), item.getVideoBufferInfo());
                    // Roll over to a new MP4 file once 60 seconds of video have been written.
                    if (item.getVideoBufferInfo().presentationTimeUs - Video60sStart > recordSecond) {
                        Log.d(TAG, "presentationTimeUs - Video60sStart: " + (item.getVideoBufferInfo().presentationTimeUs - Video60sStart));
                        mMediaMuxer.stop();
                        mMediaMuxer.release();
                        mMediaMuxer = null;
                        setMediaMuxerConfig();
                        Video60sStart = -1;
                    }
                    mImageList.removeItem();
                }
                if (itemAlarm == false && mAudioAlarm == false) {
                    mIsRecoding = false;
                    Log.d(TAG, "mediaMuxer.stop()");
                    mMediaMuxer.stop();
                    mMediaMuxer.release();
                    mMediaMuxer = null;
                    break;
                }
            }
            if (mExitThread) {
                break;
            }
        }
    }

    public synchronized void setRecording() throws IllegalStateException {
        synchronized (mRecordLock) {
            mIsRecoding = true;
        }
    }

    public synchronized void setAudioAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = true;
        }
    }

    public synchronized void setVideoAlarmTrue() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = true;
        }
    }

    public synchronized void setAudioAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            mAudioAlarm = false;
        }
    }

    public synchronized void setVideoAlarmFalse() throws IllegalStateException {
        synchronized (mRecordLock) {
            itemAlarm = false;
        }
    }

    public synchronized void notifyStartRecord() throws IllegalStateException {
        synchronized (mRecordLock) {
            mRecordLock.notify();
        }
    }

    public synchronized void push(ImageList.ImageItem item) {
        mImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
    }

    public synchronized void pushAudio(ImageList.ImageItem item) {
        synchronized (mRecordLock) {
            mAudioImageList.addItem(item.getTimestamp(), item.getVideoByteBuffer(), item.getVideoBufferInfo());
        }
    }

    public synchronized void setTimeUs(long l) {
        if (mAudioImageListTimeUs != -1) {
            return;
        }
        mAudioImageListTimeUs = l;
        Log.d(TAG, "setTimeUs: " + l);
    }

    public synchronized void setExitThread() {
        mExitThread = true;
        mIsRecoding = false;
        notifyStartRecord();
        try {
            join();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

Finally, Camera2Renderer and MainActivity.

Camera2Renderer

Camera2Renderer implements GLSurfaceView.Renderer and is the class that drives everything above.

public class Camera2Renderer implements GLSurfaceView.Renderer {
    private static final String TAG = "Abbott Camera2Renderer";
    final private Context mContext;
    final private GLSurfaceView mGlSurfaceView;
    private Camera2 mCamera;
    private int[] mTexture = new int[1];
    private SurfaceTexture mSurfaceTexture;
    private Surface mSurface;
    private OesTexture mOesTexture;
    private EGLContext mEglContext = null;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;

    public Camera2Renderer(Context context, GLSurfaceView glSurfaceView, EncodingRunnable encodingRunnable) {
        mContext = context;
        mGlSurfaceView = glSurfaceView;
        mEncodingRunnable = encodingRunnable;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        mCamera = new Camera2(mContext);
        mCamera.openCamera(1920, 1080, "0");
        mOesTexture = new OesTexture();
        mOesTexture.init();
        mEglContext = EGL14.eglGetCurrentContext();
        mVideoRecorder = new VideoRecorder(mEglContext, mEncodingRunnable);
        mVideoRecorder.startRecord();
        try {
            mAudioEncoder = new AudioEncoder(mEncodingRunnable);
            mAudioEncoder.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        GLES30.glGenTextures(1, mTexture, 0);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTexture[0]);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_NEAREST);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
        mSurfaceTexture = new SurfaceTexture(mTexture[0]);
        mSurfaceTexture.setDefaultBufferSize(1920, 1080);
        mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                mGlSurfaceView.requestRender();
            }
        });
        mSurface = new Surface(mSurfaceTexture);
        mCamera.startPreview(mSurface);
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        mSurfaceTexture.updateTexImage();
        mOesTexture.PrepareTexture(mTexture[0]);
        mVideoRecorder.requestRender(mTexture[0]);
    }

    public VideoRecorder getVideoRecorder() {
        return mVideoRecorder;
    }

    public AudioEncoder getAudioEncoder() {
        return mAudioEncoder;
    }
}

MainActivity

MainActivity is straightforward; it mostly just requests the runtime permissions.

public class MainActivity extends AppCompatActivity {
    private static final String TAG = "Abbott MainActivity";
    private static final String FRAGMENT_DIALOG = "dialog";
    private final Object mLock = new Object();
    private GLSurfaceView mGlSurfaceView;
    private Button mRecordButton;
    private Button mExitButton;
    private Camera2Renderer mCamera2Renderer;
    private VideoRecorder mVideoRecorder;
    private EncodingRunnable mEncodingRunnable;
    private AudioEncoder mAudioEncoder;
    private static final int REQUEST_CAMERA_PERMISSION = 1;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED) {
            requestCameraPermission();
            return;
        }
        setContentView(R.layout.activity_main);
        mGlSurfaceView = findViewById(R.id.glView);
        mRecordButton = findViewById(R.id.recordBtn);
        mExitButton = findViewById(R.id.exit);
        mGlSurfaceView.setEGLContextClientVersion(3);
        mEncodingRunnable = new EncodingRunnable();
        mEncodingRunnable.start();
        mCamera2Renderer = new Camera2Renderer(this, mGlSurfaceView, mEncodingRunnable);
        mGlSurfaceView.setRenderer(mCamera2Renderer);
        mGlSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mRecordButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                synchronized (MainActivity.this) {
                    startRecord();
                }
            }
        });
        mExitButton.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                stopRecord();
                Log.d(TAG, "onClick: exit program");
                finish();
            }
        });
    }

    private void requestCameraPermission() {
        if (shouldShowRequestPermissionRationale(Manifest.permission.CAMERA) ||
                shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE) ||
                shouldShowRequestPermissionRationale(Manifest.permission.RECORD_AUDIO)) {
            new ConfirmationDialog().show(getSupportFragmentManager(), FRAGMENT_DIALOG);
        } else {
            requestPermissions(new String[]{
                    Manifest.permission.CAMERA,
                    Manifest.permission.WRITE_EXTERNAL_STORAGE,
                    Manifest.permission.RECORD_AUDIO}, REQUEST_CAMERA_PERMISSION);
        }
    }

    public static class ConfirmationDialog extends DialogFragment {
        @NonNull
        @Override
        public Dialog onCreateDialog(Bundle savedInstanceState) {
            final Fragment parent = getParentFragment();
            return new AlertDialog.Builder(getActivity())
                    .setMessage(R.string.request_permission)
                    .setPositiveButton(android.R.string.ok, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                        }
                    })
                    .setNegativeButton(android.R.string.cancel, new DialogInterface.OnClickListener() {
                        @Override
                        public void onClick(DialogInterface dialog, int which) {
                            Activity activity = parent.getActivity();
                            if (activity != null) {
                                activity.finish();
                            }
                        }
                    })
                    .create();
        }
    }

    private void startRecord() {
        synchronized (mLock) {
            try {
                if (mVideoRecorder == null) {
                    mVideoRecorder = mCamera2Renderer.getVideoRecorder();
                }
                if (mAudioEncoder == null) {
                    mAudioEncoder = mCamera2Renderer.getAudioEncoder();
                }
                mVideoRecorder.setAlarm();
                mAudioEncoder.setAlarm();
                mEncodingRunnable.setMediaMuxerSavaPath();
                Log.d(TAG, "Start Record");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    private void stopRecord() {
        if (mVideoRecorder == null) {
            mVideoRecorder = mCamera2Renderer.getVideoRecorder();
        }
        if (mAudioEncoder == null) {
            mAudioEncoder = mCamera2Renderer.getAudioEncoder();
        }
        mEncodingRunnable.setExitThread();
        mVideoRecorder.stopVideoRecord();
        mAudioEncoder.stopAudioRecord();
    }
}