WebGL 3D Object Modeling

Interface preview

Code structure

The model source data resembles CT (Computed Tomography) output. CT uses precisely collimated X-ray beams, γ-rays, or ultrasound, together with highly sensitive detectors, to scan a body part cross-section by cross-section.

Coordinate system

Rendering pipeline

The rendering pipeline is the process that turns the prepared model into output on screen. A 3D rendering pipeline accepts raw data describing a 3D object as vertices, processes it, computes its fragments, and renders them as pixels on the screen.
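To make the vertex-to-pixel path concrete, here is a small plain-JavaScript sketch (not part of the demo's code; the function name and matrix values are illustrative) that pushes one vertex through a 4x4 column-major model-view-projection matrix, applies the perspective divide, and maps the resulting normalized device coordinates to pixel coordinates:

```javascript
// Transform one vertex through a 4x4 column-major MVP matrix (WebGL convention),
// then perspective-divide and map normalized device coordinates to pixels.
function transformVertex(mvp, v, width, height) {
  const p = [v[0], v[1], v[2], 1.0];
  const clip = [0, 0, 0, 0];
  for (let row = 0; row < 4; row++) {
    for (let col = 0; col < 4; col++) {
      clip[row] += mvp[col * 4 + row] * p[col]; // column-major layout
    }
  }
  // Perspective divide: clip space -> normalized device coordinates in [-1, 1].
  const ndc = [clip[0] / clip[3], clip[1] / clip[3], clip[2] / clip[3]];
  return {
    ndc,
    // Viewport transform: NDC -> pixel coordinates (y flipped, since screen y grows downward).
    pixel: [(ndc[0] + 1) / 2 * width, (1 - ndc[1]) / 2 * height],
  };
}

// With an identity MVP the vertex passes through unchanged.
const identity = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];
console.log(transformVertex(identity, [0.5, -0.5, 0], 800, 600).pixel); // right of center, lower half
```

A real pipeline does this per vertex in the vertex shader and then rasterizes the resulting triangles into fragments.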

Shaders

Shaders are written in GLSL, the OpenGL Shading Language: a special language with C-like syntax that executes directly in the graphics pipeline. There are two types of shader: the vertex shader (Vertex Shader), which transforms shapes into actual 3D drawing coordinates, and the fragment shader (Fragment Shader), which computes the final rendered color and other attributes.

Unlike JavaScript, GLSL is strongly typed, and it has many built-in operations for vector and matrix math. Writing sophisticated shaders quickly is complex, but creating a simple one is not hard.

A shader is essentially a function that draws something to the screen. Shaders run on the GPU, which is heavily optimized for these operations, so you can offload a lot of work from the CPU and leave it free to run your own code.

The vertex shader operates on coordinates in 3D space and is called once per vertex. Its purpose is to set the gl_Position variable, a special built-in global that stores the current vertex's position.

The fragment (or texel) shader computes the RGBA color of each pixel and is called once per fragment. Its job is to set the gl_FragColor variable, another GLSL built-in.
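A minimal illustrative shader pair (not the demo's shaders, just the two built-ins in use), written as JavaScript template strings the way shader sources are usually embedded:

```javascript
// A minimal GLSL shader pair: the vertex shader writes gl_Position,
// the fragment shader writes gl_FragColor (a constant opaque red here).
const minimalVertexShader = `
  attribute vec4 a_Position;
  void main() {
    gl_Position = a_Position; // clip-space position of this vertex
  }
`;

const minimalFragmentShader = `
  precision mediump float;
  void main() {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // RGBA, opaque red
  }
`;
```

Every useful shader is an elaboration of this pattern: more inputs (attributes, uniforms, varyings) feeding the same two outputs.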

<script id="fragmentShaderFirstPass" type="x-shader/x-fragment">
varying vec3 worldSpaceCoords;

void main()
{
    //The fragment's world space coordinates as fragment output.
    gl_FragColor = vec4( worldSpaceCoords.x, worldSpaceCoords.y, worldSpaceCoords.z, 1 );
}
</script>

<script id="vertexShaderFirstPass" type="x-shader/x-vertex">
varying vec3 worldSpaceCoords;

void main()
{
    //Set the world space coordinates of the back faces vertices as output.
    worldSpaceCoords = position + vec3(0.5, 0.5, 0.5); //move it from [-0.5;0.5] to [0,1]
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>

<script id="fragmentShaderSecondPass" type="x-shader/x-fragment">
varying vec3 worldSpaceCoords;
varying vec4 projectedCoords;
uniform sampler2D tex, cubeTex, transferTex;
uniform float steps;
uniform float alphaCorrection;

// The maximum distance through our rendering volume is sqrt(3).
// The maximum number of steps we take to travel a distance of 1 is 512.
// ceil( sqrt(3) * 512 ) = 887
// This prevents the back of the image from getting cut off when steps=512 & viewing diagonally.
const int MAX_STEPS = 887;

//Acts like a texture3D using Z slices and trilinear filtering.
vec4 sampleAs3DTexture( vec3 texCoord )
{
    vec4 colorSlice1, colorSlice2;
    vec2 texCoordSlice1, texCoordSlice2;

    //The z coordinate determines which Z slice we have to look for.
    //Z slice number goes from 0 to 255.
    float zSliceNumber1 = floor(texCoord.z * 255.0);

    //As we use trilinear we go the next Z slice.
    float zSliceNumber2 = min( zSliceNumber1 + 1.0, 255.0); //Clamp to 255

    //The Z slices are stored in a matrix of 16x16 of Z slices.
    //The original UV coordinates have to be rescaled by the tile numbers in each row and column.
    texCoord.xy /= 16.0;

    texCoordSlice1 = texCoordSlice2 = texCoord.xy;

    //Add an offset to the original UV coordinates depending on the row and column number.
    texCoordSlice1.x += (mod(zSliceNumber1, 16.0 ) / 16.0);
    texCoordSlice1.y += floor((255.0 - zSliceNumber1) / 16.0) / 16.0;
    texCoordSlice2.x += (mod(zSliceNumber2, 16.0 ) / 16.0);
    texCoordSlice2.y += floor((255.0 - zSliceNumber2) / 16.0) / 16.0;

    //Get the opacity value from the 2D texture.
    //Bilinear filtering is done at each texture2D by default.
    colorSlice1 = texture2D( cubeTex, texCoordSlice1 );
    colorSlice2 = texture2D( cubeTex, texCoordSlice2 );

    //Based on the opacity obtained earlier, get the RGB color in the transfer function texture.
    colorSlice1.rgb = texture2D( transferTex, vec2( colorSlice1.a, 1.0) ).rgb;
    colorSlice2.rgb = texture2D( transferTex, vec2( colorSlice2.a, 1.0) ).rgb;

    //How distant is zSlice1 to ZSlice2. Used to interpolate between one Z slice and the other.
    float zDifference = mod(texCoord.z * 255.0, 1.0);

    //Finally interpolate between the two intermediate colors of each Z slice.
    return mix(colorSlice1, colorSlice2, zDifference);
}

void main( void )
{
    //Transform the coordinates from [-1;1] to [0;1].
    vec2 texc = vec2(((projectedCoords.x / projectedCoords.w) + 1.0 ) / 2.0,
                     ((projectedCoords.y / projectedCoords.w) + 1.0 ) / 2.0 );

    //The back position is the world space position stored in the texture.
    vec3 backPos = texture2D(tex, texc).xyz;

    //The front position is the world space position of the second render pass.
    vec3 frontPos = worldSpaceCoords;

    //The direction from the front position to back position.
    vec3 dir = backPos - frontPos;

    float rayLength = length(dir);

    //Calculate how long to increment in each step.
    float delta = 1.0 / steps;

    //The increment in each direction for each step.
    vec3 deltaDirection = normalize(dir) * delta;
    float deltaDirectionLength = length(deltaDirection);

    //Start the ray casting from the front position.
    vec3 currentPosition = frontPos;

    //The color accumulator.
    vec4 accumulatedColor = vec4(0.0);

    //The alpha value accumulated so far.
    float accumulatedAlpha = 0.0;

    //How long has the ray travelled so far.
    float accumulatedLength = 0.0;

    //If we have twice as many samples, we only need ~1/2 the alpha per sample.
    //Scaling by 256/10 just happens to give a good value for the alphaCorrection slider.
    float alphaScaleFactor = 25.6 * delta;

    vec4 colorSample;
    float alphaSample;

    //Perform the ray marching iterations.
    for(int i = 0; i < MAX_STEPS; i++)
    {
        //Get the voxel intensity value from the 3D texture.
        colorSample = sampleAs3DTexture( currentPosition );

        //Allow the alpha correction customization.
        alphaSample = colorSample.a * alphaCorrection;

        //Applying this effect to both the color and alpha accumulation results in more realistic transparency.
        alphaSample *= (1.0 - accumulatedAlpha);

        //Scaling alpha by the number of steps makes the final color invariant to the step size.
        alphaSample *= alphaScaleFactor;

        //Perform the composition.
        accumulatedColor += colorSample * alphaSample;

        //Store the alpha accumulated so far.
        accumulatedAlpha += alphaSample;

        //Advance the ray.
        currentPosition += deltaDirection;
        accumulatedLength += deltaDirectionLength;

        //If the length traversed is more than the ray length, or if the alpha accumulated reaches 1.0, then exit.
        if(accumulatedLength >= rayLength || accumulatedAlpha >= 1.0 )
            break;
    }

    gl_FragColor = accumulatedColor;
}
</script>

<script id="vertexShaderSecondPass" type="x-shader/x-vertex">
varying vec3 worldSpaceCoords;
varying vec4 projectedCoords;

void main()
{
    worldSpaceCoords = (modelMatrix * vec4(position + vec3(0.5, 0.5, 0.5), 1.0 )).xyz;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
    projectedCoords = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
}
</script>
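For reference, the slice-atlas indexing inside sampleAs3DTexture above can be mirrored in plain JavaScript. The 256 Z slices are packed into a 16x16 grid inside a single 2D texture; given a 3D texture coordinate in [0,1]^3, this computes the two 2D UVs to sample and the interpolation weight between them (a CPU-side sketch for clarity, not code used by the demo):

```javascript
// Mirror of the slice-atlas indexing in sampleAs3DTexture: 256 Z slices
// packed into a 16x16 tile grid. Returns the 2D UVs of the two slices to
// blend and the trilinear interpolation weight between them.
function atlasLookup(u, v, z) {
  const slice1 = Math.floor(z * 255.0);
  const slice2 = Math.min(slice1 + 1.0, 255.0); // clamp to the last slice

  // Rescale the UV to one tile, then offset by the tile's row and column.
  const tileUV = (sliceIdx) => [
    u / 16.0 + (sliceIdx % 16.0) / 16.0,
    v / 16.0 + Math.floor((255.0 - sliceIdx) / 16.0) / 16.0,
  ];

  return {
    uv1: tileUV(slice1),
    uv2: tileUV(slice2),
    weight: (z * 255.0) % 1.0, // fractional part: mix(colorSlice1, colorSlice2, weight)
  };
}
```

Combined with the GPU's built-in bilinear filtering inside each tile, this linear blend between two slices is what makes the 2D atlas behave like a trilinearly filtered 3D texture.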

Building the 3D scene

<script>
if ( ! Detector.webgl ) Detector.addGetWebGLMessage();

var container, stats;
var camera, sceneFirstPass, sceneSecondPass, renderer;
var clock = new THREE.Clock();
var rtTexture, transferTexture;
var cubeTextures = ['bonsai', 'foot', 'teapot'];
var histogram = [];
var guiControls;
var materialSecondPass;

init();
//animate();

function init() {
    //Parameters that can be modified.
    guiControls = new function() {
        this.model = 'bonsai';
        this.steps = 256.0;
        this.alphaCorrection = 1.0;
        this.color1 = "#00FA58";
        this.stepPos1 = 0.1;
        this.color2 = "#CC6600";
        this.stepPos2 = 0.7;
        this.color3 = "#F2F200";
        this.stepPos3 = 1.0;
    };

    container = document.getElementById( 'container' );

    camera = new THREE.PerspectiveCamera( 40, window.innerWidth / window.innerHeight, 0.01, 3000.0 );
    camera.position.z = 2.0;

    controls = new THREE.OrbitControls( camera, container );
    controls.center.set( 0.0, 0.0, 0.0 );

    //Load the 2D texture containing the Z slices.
    THREE.ImageUtils.crossOrigin = 'anonymous'; // handle cross-origin texture loading
    cubeTextures['bonsai'] = THREE.ImageUtils.loadTexture('./images/bonsai.raw.png');
    cubeTextures['teapot'] = THREE.ImageUtils.loadTexture('./images/teapot.raw.png');
    cubeTextures['foot'] = THREE.ImageUtils.loadTexture('./images/foot.raw.png');

    //Don't let it generate mipmaps to save memory and apply linear filtering to prevent use of LOD.
    cubeTextures['bonsai'].generateMipmaps = false;
    cubeTextures['bonsai'].minFilter = THREE.LinearFilter;
    cubeTextures['bonsai'].magFilter = THREE.LinearFilter;
    cubeTextures['teapot'].generateMipmaps = false;
    cubeTextures['teapot'].minFilter = THREE.LinearFilter;
    cubeTextures['teapot'].magFilter = THREE.LinearFilter;
    cubeTextures['foot'].generateMipmaps = false;
    cubeTextures['foot'].minFilter = THREE.LinearFilter;
    cubeTextures['foot'].magFilter = THREE.LinearFilter;

    var transferTexture = updateTransferFunction();

    var screenSize = new THREE.Vector2( window.innerWidth, window.innerHeight );
    rtTexture = new THREE.WebGLRenderTarget( screenSize.x, screenSize.y,
        { minFilter: THREE.LinearFilter,
          magFilter: THREE.LinearFilter,
          wrapS: THREE.ClampToEdgeWrapping,
          wrapT: THREE.ClampToEdgeWrapping,
          format: THREE.RGBFormat,
          type: THREE.FloatType,
          generateMipmaps: false } );

    var materialFirstPass = new THREE.ShaderMaterial( {
        vertexShader: document.getElementById( 'vertexShaderFirstPass' ).textContent,
        fragmentShader: document.getElementById( 'fragmentShaderFirstPass' ).textContent,
        side: THREE.BackSide
    } );

    materialSecondPass = new THREE.ShaderMaterial( {
        vertexShader: document.getElementById( 'vertexShaderSecondPass' ).textContent,
        fragmentShader: document.getElementById( 'fragmentShaderSecondPass' ).textContent,
        side: THREE.FrontSide,
        uniforms: {
            tex: { type: "t", value: rtTexture },
            cubeTex: { type: "t", value: cubeTextures['bonsai'] },
            transferTex: { type: "t", value: transferTexture },
            steps: { type: "1f", value: guiControls.steps },
            alphaCorrection: { type: "1f", value: guiControls.alphaCorrection }
        }
    });

    sceneFirstPass = new THREE.Scene();
    sceneSecondPass = new THREE.Scene();

    var boxGeometry = new THREE.BoxGeometry(1.0, 1.0, 1.0);
    boxGeometry.doubleSided = true;

    var meshFirstPass = new THREE.Mesh( boxGeometry, materialFirstPass );
    var meshSecondPass = new THREE.Mesh( boxGeometry, materialSecondPass );

    sceneFirstPass.add( meshFirstPass );
    sceneSecondPass.add( meshSecondPass );

    renderer = new THREE.WebGLRenderer();
    container.appendChild( renderer.domElement );

    stats = new Stats();
    stats.domElement.style.position = 'absolute';
    stats.domElement.style.top = '0px';
    container.appendChild( stats.domElement );

    var gui = new dat.GUI();
    var modelSelected = gui.add(guiControls, 'model', [ 'bonsai', 'foot', 'teapot' ] );
    gui.add(guiControls, 'steps', 0.0, 512.0);
    gui.add(guiControls, 'alphaCorrection', 0.01, 5.0).step(0.01);
    modelSelected.onChange(function(value) { materialSecondPass.uniforms.cubeTex.value = cubeTextures[value]; } );

    //Setup transfer function steps.
    var step1Folder = gui.addFolder('Step 1');
    var controllerColor1 = step1Folder.addColor(guiControls, 'color1');
    var controllerStepPos1 = step1Folder.add(guiControls, 'stepPos1', 0.0, 1.0);
    controllerColor1.onChange(updateTextures);
    controllerStepPos1.onChange(updateTextures);

    var step2Folder = gui.addFolder('Step 2');
    var controllerColor2 = step2Folder.addColor(guiControls, 'color2');
    var controllerStepPos2 = step2Folder.add(guiControls, 'stepPos2', 0.0, 1.0);
    controllerColor2.onChange(updateTextures);
    controllerStepPos2.onChange(updateTextures);

    var step3Folder = gui.addFolder('Step 3');
    var controllerColor3 = step3Folder.addColor(guiControls, 'color3');
    var controllerStepPos3 = step3Folder.add(guiControls, 'stepPos3', 0.0, 1.0);
    controllerColor3.onChange(updateTextures);
    controllerStepPos3.onChange(updateTextures);

    step1Folder.open();
    step2Folder.open();
    step3Folder.open();

    onWindowResize();
    window.addEventListener( 'resize', onWindowResize, false );
}

function imageLoad(url) {
    var loader = new THREE.TextureLoader();
    loader.crossOrigin = ''; // request CORS permission for cross-origin images
    let textureAssets = null; // was `const`, which would throw on reassignment below

    // load a resource (TextureLoader.load expects a URL)
    loader.load(
        // resource URL
        url,
        // Function called when the resource is loaded
        function ( texture ) {
            // do something with the texture
            texture.wrapS = THREE.RepeatWrapping;
            texture.wrapT = THREE.RepeatWrapping;
            texture.offset.x = 90 / (2 * Math.PI);
            textureAssets = texture;
        },
        // Function called when the download progresses
        function ( xhr ) {
            console.log( (xhr.loaded / xhr.total * 100) + '% loaded' );
        },
        // Function called when the download errors
        function ( xhr ) {
            console.log( 'An error happened' );
        });

    // Note: loading is asynchronous, so this still returns null until the
    // onLoad callback above has fired.
    return textureAssets;
}

function updateTextures(value) {
    materialSecondPass.uniforms.transferTex.value = updateTransferFunction();
}

function updateTransferFunction() {
    var canvas = document.createElement('canvas');
    canvas.height = 20;
    canvas.width = 256;

    var ctx = canvas.getContext('2d');
    var grd = ctx.createLinearGradient(0, 0, canvas.width - 1, canvas.height - 1);
    grd.addColorStop(guiControls.stepPos1, guiControls.color1);
    grd.addColorStop(guiControls.stepPos2, guiControls.color2);
    grd.addColorStop(guiControls.stepPos3, guiControls.color3);

    ctx.fillStyle = grd;
    ctx.fillRect(0, 0, canvas.width - 1, canvas.height - 1);

    var img = document.getElementById("transferFunctionImg");
    img.src = canvas.toDataURL();
    img.style.width = "256px";
    img.style.height = "128px";

    transferTexture = new THREE.Texture(canvas);
    transferTexture.wrapS = transferTexture.wrapT = THREE.ClampToEdgeWrapping;
    transferTexture.needsUpdate = true;

    return transferTexture;
}

function onWindowResize( event ) {
    camera.aspect = window.innerWidth / window.innerHeight;
    camera.updateProjectionMatrix();
    renderer.setSize( window.innerWidth, window.innerHeight );
}

function animate() {
    requestAnimationFrame( animate );
    render();
    stats.update();
}

function render() {
    var delta = clock.getDelta();

    //Render first pass and store the world space coords of the back face fragments into the texture.
    renderer.render( sceneFirstPass, camera, rtTexture, true );

    //Render the second pass and perform the volume rendering.
    renderer.render( sceneSecondPass, camera );

    materialSecondPass.uniforms.steps.value = guiControls.steps;
    materialSecondPass.uniforms.alphaCorrection.value = guiControls.alphaCorrection;
}

//Leandro R Barbagallo - 2015 - lebarba at gmail.com

const VSHADER_SOURCE = `
attribute vec4 a_Position;
attribute vec2 a_uv;
varying vec2 v_uv;
void main() {
    gl_Position = a_Position;
    v_uv = a_uv;
}`;

const FSHADER_SOURCE = `
precision mediump float;
// Declare samplers. sampler2D is a data type, just like vec2.
uniform sampler2D u_Sampler;
uniform sampler2D u_Sampler2;
varying vec2 v_uv;
void main() {
    // texture2D(sampler2D sampler, vec2 coord) - GLSL built-in that fetches the
    // pixel at texture coordinate coord from the texture bound to sampler.
    vec4 color = texture2D(u_Sampler, v_uv);
    vec4 color2 = texture2D(u_Sampler2, v_uv);
    gl_FragColor = color * color2;
}`;

function main() {
    const canvas = document.getElementById('webgl');
    const gl = canvas.getContext("webgl");
    if (!gl) {
        console.log('Failed to get the rendering context for WebGL');
        return;
    }
    if (!initShaders(gl, VSHADER_SOURCE, FSHADER_SOURCE)) {
        console.log('Failed to initialize shaders.');
        return;
    }
    gl.clearColor(0.0, 0.5, 0.5, 1.0);

    // Coordinates of the quad's four vertices.
    const verticesOfPosition = new Float32Array([
        // bottom-left corner first, counter-clockwise
        -0.5, -0.5,
         0.5, -0.5,
         0.5,  0.5,
        -0.5,  0.5,
    ]);

    // Texture coordinates of the four corners.
    const uvs = new Float32Array([
        // bottom-left corner first, counter-clockwise, matching the vertex order
        0.0, 0.0,
        1.0, 0.0,
        1.0, 1.0,
        0.0, 1.0
    ]);

    initVertexBuffers(gl, verticesOfPosition);
    initUvBuffers(gl, uvs);
    initTextures(gl);
    initMaskTextures(gl);
}

// Initialize textures. The plural "s" is because several images can be applied.
function initTextures(gl) {
    // Create the image element.
    const img = new Image();
    // Request CORS permission, otherwise: "The image element contains
    // cross-origin data, and may not be loaded."
    img.crossOrigin = "";
    img.src = "./images/bonsai.raw.png";
    img.onload = () => {
        // Create the texture object.
        const texture = gl.createTexture();
        // Get the sampler location.
        const u_Sampler = gl.getUniformLocation(gl.program, 'u_Sampler');
        if (!u_Sampler) {
            console.log('Failed to get the storage location of u_Sampler');
            return false;
        }
        // pixelStorei - flip the image's Y axis during unpack (the image itself is unchanged).
        gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
        // Activate texture unit 0.
        gl.activeTexture(gl.TEXTURE0);
        // Bind the texture object.
        gl.bindTexture(gl.TEXTURE_2D, texture);
        // Configure texture parameters.
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.MIRRORED_REPEAT);
        // Upload the texture image to the texture object.
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, img);
        // Pass the texture unit to the fragment shader.
        gl.uniform1i(u_Sampler, 0);
        gl.clear(gl.COLOR_BUFFER_BIT);
        gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
    };
}

// Initialize the mask texture on texture unit 1.
function initMaskTextures(gl) {
    const img = new Image();
    img.src = "./images/teapot.raw.png";
    img.onload = () => {
        // Create the texture object.
        const texture = gl.createTexture();
        // Get the sampler location.
        const u_Sampler = gl.getUniformLocation(gl.program, 'u_Sampler2');
        if (!u_Sampler) {
            console.log('Failed to get the storage location of u_Sampler2');
            return false;
        }
        // pixelStorei - flip the image's Y axis during unpack (the image itself is unchanged).
        gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
        // Activate texture unit 1.
        gl.activeTexture(gl.TEXTURE1);
        // Bind the texture object.
        gl.bindTexture(gl.TEXTURE_2D, texture);
        // Configure texture parameters.
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        // Upload the texture image to the texture object.
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, img);
        // Pass texture unit 1 to the fragment shader.
        gl.uniform1i(u_Sampler, 1);
        gl.clear(gl.COLOR_BUFFER_BIT);
        gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
    };
}

function initVertexBuffers(gl, positions) {
    const vertexBuffer = gl.createBuffer();
    if (!vertexBuffer) {
        console.log('Failed to create the buffer object');
        return -1;
    }
    gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);
    const a_Position = gl.getAttribLocation(gl.program, 'a_Position');
    if (a_Position < 0) {
        console.log('Failed to get the storage location of a_Position');
        return -1;
    }
    gl.vertexAttribPointer(a_Position, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(a_Position);
}

function initUvBuffers(gl, uvs) {
    const uvsBuffer = gl.createBuffer();
    if (!uvsBuffer) {
        console.log('Failed to create the uvs buffer object');
        return -1;
    }
    gl.bindBuffer(gl.ARRAY_BUFFER, uvsBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, uvs, gl.STATIC_DRAW);
    const a_uv = gl.getAttribLocation(gl.program, 'a_uv');
    if (a_uv < 0) {
        console.log('Failed to get the storage location of a_uv');
        return -1;
    }
    gl.vertexAttribPointer(a_uv, 2, gl.FLOAT, false, 0, 0);
    gl.enableVertexAttribArray(a_uv);
}
</script>
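The core accumulation in the ray-marching loop of fragmentShaderSecondPass is front-to-back alpha compositing. A CPU-side sketch of that accumulation (the function name and the samples array are illustrative, not part of the demo's code):

```javascript
// Front-to-back compositing of RGBA samples along one ray, mirroring the
// accumulation step of the second-pass fragment shader. `samples` is an
// array of {r, g, b, a}; alphaCorrection and steps correspond to the
// shader uniforms of the same names.
function compositeRay(samples, alphaCorrection, steps) {
  const delta = 1.0 / steps;
  const alphaScale = 25.6 * delta; // 256/10, as in the shader's comment
  const color = [0, 0, 0];
  let alpha = 0;
  for (const s of samples) {
    let a = s.a * alphaCorrection;
    a *= (1.0 - alpha);   // attenuate by what is already opaque in front
    a *= alphaScale;      // make the result invariant to the step count
    color[0] += s.r * a;
    color[1] += s.g * a;
    color[2] += s.b * a;
    alpha += a;
    if (alpha >= 1.0) break; // early ray termination
  }
  return { color, alpha };
}
```

Because each sample's alpha is scaled by 1/steps, doubling the step count halves every sample's contribution, which is why changing the steps slider changes quality but not overall brightness.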

Environment setup


You only need to focus on the shader code; Three.js and other 3D libraries abstract a great deal away for you. To build this example in raw WebGL, you would have to write a lot of additional code just to get it running. Getting started with WebGL shaders does not take much, just these three steps:

1. Make sure you are using a modern browser with good WebGL support, such as the latest Firefox or Chrome.
2. Create a directory to hold your experiments.
3. Copy a minified build of the Three.js library into that directory.

See also:

LearnOpenGL - Coordinate Systems

WebGL model view projection - Web APIs | MDN

GLSL Shaders - Game development | MDN

Explaining basic 3D theory - Game development | MDN

Lebarba - WebGL Volume Rendering made easy
