A Hands-On Guide to Building an AR Coloring Book (AR塗塗樂)

A while ago my company had an AR coloring-book project. I had touched AR before and written a few small demos, but I had never developed a complete AR project. After a bit more than a week of study I have now learned everything the project needs, and I want to salute the programmers on the internet who generously share what they know. While learning I noticed that many blog posts only give code with no explanation, so here is a somewhat more detailed coloring-book tutorial.

I. How AR Coloring Works

Among all the AR products on the market today, the coloring book is one of the more successful ones; its vivid, lively, novel character has made it very popular in the early-education industry. The principle behind AR coloring is actually very simple: the colors painted on the recognition image, which serves as the coloring page, are captured and rendered as a texture onto an initially blank 3D model.

II. Production Workflow

Here is my rough summary of the concrete steps from model to finished AR app:

  1. The artists build the models and animations the AR app needs
  2. Once a model is done, lay out its UVs so they match the outline of the model on the recognition image
  3. With the UVs matched, hand the model and the recognition image to the programmer, who logs in to the Vuforia website to create a license key and a target database
  4. Download the Vuforia plugin and the target database and import both into the Unity project
  5. Delete the default camera in the scene, add an ARCamera and an ImageTarget, and configure their parameters
  6. Put the coloring models into the scene and cut up their animations
  7. Compute the positions of the four corners of the recognition image, grab a camera frame, and pass these parameters to the shader; once the shader has processed them, the colors on the recognition image appear on the model
  8. Build to the phone; in my case the target platform is Android

III. Step-by-Step Tutorial

I will use a project I wrote recently as the example:

1. I did not make the art myself, so I will just write down the requirements I gave the artists at the time:

  • UVs must be laid out properly and match the recognition image
  • Animated models must remain separate objects, not merged with other models
  • Model animations should loop, with two playthroughs forming one cycle
  • The recognition image should have strong contrast
  • Names must be meaningful, and the different components of one model must be grouped together
  • Units: meters
  • Format: FBX

Deliverables:

  • Model:

    2.png

  • Recognition image:

    1.png

In this model only the flower is meant to be painted, so the recognition image is a blank flower.

2. Vuforia preparation (license key and target database)

  • If you have not registered yet, register first; one thing to note is that the Vuforia password must contain upper- and lower-case letters and special characters
  • After registering, click Develop ---> Add License Key

3.png

  • For normal testing just choose Development, then give it a name and click Next

4.png

  • Then just confirm

5.png

  • This License Key will be needed in the project shortly, so save it in a text file for later

Paste_Image.png

  • Now it is time to add the recognition image; first create a Target Database
    Paste_Image.png

Paste_Image.png

  • Then click the Target Database you just created

Paste_Image.png

  • Add the recognition image

Paste_Image.png

  • Wait patiently and do not close the page

Paste_Image.png

  • Download the target database for later use

Paste_Image.png

  • Download the Vuforia plugin for later use

Paste_Image.png

3. Unity setup

  • Set up the Android build environment (I have written a detailed blog post about this before)

  • Switch the build platform to Android (if you do not switch ahead of time, the build may throw errors when exporting); a small editor-script alternative is sketched after the screenshot below

Paste_Image.png
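
If you would rather script this step than click through Build Settings, here is a minimal editor-script sketch; the class and menu names are made up, it assumes Unity 5.6 or newer, and the file must live in an Editor folder:

using UnityEditor;

// Editor-only helper: switches the active build target to Android from a menu item.
public static class SwitchPlatformMenu
{
    [MenuItem("Tools/Switch To Android")]
    private static void SwitchToAndroid()
    {
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.Android, BuildTarget.Android);
    }
}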

  • Import, in order, the Vuforia plugin package, the target database package, and the art assets

Paste_Image.png

  • Delete the Camera in the scene and add an AR Camera and an Image Target

Paste_Image.png

  • Configure the AR Camera

Paste_Image.png

  • Configure the Image Target

Paste_Image.png

  • Drag the model under the Image Target and adjust its position

Paste_Image.png

4. Programming

Overall idea: the main job in the programming stage is to take the texture information from the recognition image, run it through some computation, and assign it to the model. Because of the animation, a model may be split into many small sub-meshes, and every one of them has to go through this computation; here the flower's 7 petals are independent, so 7 passes are needed.

  • Core code
using UnityEngine;
using Vuforia;
using System.Collections;

public class ARRender : MonoBehaviour
{

    public GameObject Scene;

    private Animator flowerAnimator;
    
    // The seven petals of the rainbow flower
    public GameObject flower1;
    public GameObject flower2;
    public GameObject flower3;
    public GameObject flower4;
    public GameObject flower5;
    public GameObject flower6;
    public GameObject flower7;

    private Texture2D texture;
    // Texture2D that stores the screenshot

    private int screenWidth;
    // Screen width
    private int screenHeight;
    // Screen height

    // World coordinates of the four corner points of the real texture (the recognition image)
    Vector3 targetAnglePoint1;
    // Top-left corner
    Vector3 targetAnglePoint2;
    // Bottom-left corner
    Vector3 targetAnglePoint3;
    // Top-right corner
    Vector3 targetAnglePoint4;
    // Bottom-right corner

    public GameObject plane;
    // The Plane object that defines the size of the texture

    Vector2 halfSize;
    // Half of the plane's width and height


    void Start()
    {

        screenWidth = Screen.width;
        // Screen width
        screenHeight = Screen.height;
        // Screen height

        texture = new Texture2D(screenWidth, screenHeight, TextureFormat.RGB24, false); // Create an empty texture

        flowerAnimator = this.GetComponent<Animator>();

    }


    // Screenshot function: grabs the current camera frame and feeds it to the petal materials
    public void ScreenShot()
    {
       Scene.SetActive(true);
       
        flowerAnimator.SetTrigger("FlowerRainbow");
        

        texture.ReadPixels(new Rect(0, 0, screenWidth, screenHeight), 0, 0);
        // Read the screen pixels
        texture.Apply();
        // Store them as texture data

        halfSize = new Vector2(plane.GetComponent<MeshFilter>().mesh.bounds.size.x, plane.GetComponent<MeshFilter>().mesh.bounds.size.z) * 50.0f * 0.5f;
        // Half of the Plane's width and height
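        // NOTE: the 50.0f factor matches the Plane/ImageTarget scale used in this particular project; adjust it to your own scene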

        // World coordinates of the four corners of the real texture
        targetAnglePoint1 = transform.parent.position + new Vector3(-halfSize.x, 0, halfSize.y);
        targetAnglePoint2 = transform.parent.position + new Vector3(-halfSize.x, 0, -halfSize.y);
        targetAnglePoint3 = transform.parent.position + new Vector3(halfSize.x, 0, halfSize.y);
        targetAnglePoint4 = transform.parent.position + new Vector3(halfSize.x, 0, -halfSize.y);

        // Build the view-projection (VP) matrix of the AR camera
        Matrix4x4 P = GL.GetGPUProjectionMatrix(Camera.main.projectionMatrix, false);
        Matrix4x4 V = Camera.main.worldToCameraMatrix;
        Matrix4x4 VP = P * V;

        // Pass the four corner world coordinates, the VP matrix and the screenshot to each petal's shader
        flower1.GetComponent<Renderer>().material.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
        flower1.GetComponent<Renderer>().material.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
        flower1.GetComponent<Renderer>().material.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
        flower1.GetComponent<Renderer>().material.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
        flower1.GetComponent<Renderer>().material.SetMatrix("_VP", VP);
        flower1.GetComponent<Renderer>().material.mainTexture = texture;

        flower2.GetComponent<Renderer>().material.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
        flower2.GetComponent<Renderer>().material.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
        flower2.GetComponent<Renderer>().material.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
        flower2.GetComponent<Renderer>().material.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
        flower2.GetComponent<Renderer>().material.SetMatrix("_VP", VP);
        flower2.GetComponent<Renderer>().material.mainTexture = texture;

        flower3.GetComponent<Renderer>().material.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
        flower3.GetComponent<Renderer>().material.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
        flower3.GetComponent<Renderer>().material.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
        flower3.GetComponent<Renderer>().material.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
        flower3.GetComponent<Renderer>().material.SetMatrix("_VP", VP);
        flower3.GetComponent<Renderer>().material.mainTexture = texture;

        flower4.GetComponent<Renderer>().material.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
        flower4.GetComponent<Renderer>().material.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
        flower4.GetComponent<Renderer>().material.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
        flower4.GetComponent<Renderer>().material.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
        flower4.GetComponent<Renderer>().material.SetMatrix("_VP", VP);
        flower4.GetComponent<Renderer>().material.mainTexture = texture;

        flower5.GetComponent<Renderer>().material.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
        flower5.GetComponent<Renderer>().material.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
        flower5.GetComponent<Renderer>().material.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
        flower5.GetComponent<Renderer>().material.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
        flower5.GetComponent<Renderer>().material.SetMatrix("_VP", VP);
        flower5.GetComponent<Renderer>().material.mainTexture = texture;

        flower6.GetComponent<Renderer>().material.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
        flower6.GetComponent<Renderer>().material.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
        flower6.GetComponent<Renderer>().material.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
        flower6.GetComponent<Renderer>().material.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
        flower6.GetComponent<Renderer>().material.SetMatrix("_VP", VP);
        flower6.GetComponent<Renderer>().material.mainTexture = texture;

        flower7.GetComponent<Renderer>().material.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
        flower7.GetComponent<Renderer>().material.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
        flower7.GetComponent<Renderer>().material.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
        flower7.GetComponent<Renderer>().material.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
        flower7.GetComponent<Renderer>().material.SetMatrix("_VP", VP);
        flower7.GetComponent<Renderer>().material.mainTexture = texture;        
    }
}
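
The seven blocks above differ only in which petal they touch. If you want something tidier, the tail of ScreenShot() could be written as a loop over the petals instead; this is just a sketch with the same behavior, using the fields and locals already declared in ARRender:

        // Sketch: apply the corner points, VP matrix and screenshot to every petal in one loop
        GameObject[] petals = { flower1, flower2, flower3, flower4, flower5, flower6, flower7 };
        foreach (GameObject petal in petals)
        {
            Material m = petal.GetComponent<Renderer>().material;
            m.SetVector("_Uvpoint1", new Vector4(targetAnglePoint1.x, targetAnglePoint1.y, targetAnglePoint1.z, 1f));
            m.SetVector("_Uvpoint2", new Vector4(targetAnglePoint2.x, targetAnglePoint2.y, targetAnglePoint2.z, 1f));
            m.SetVector("_Uvpoint3", new Vector4(targetAnglePoint3.x, targetAnglePoint3.y, targetAnglePoint3.z, 1f));
            m.SetVector("_Uvpoint4", new Vector4(targetAnglePoint4.x, targetAnglePoint4.y, targetAnglePoint4.z, 1f));
            m.SetMatrix("_VP", VP);
            m.mainTexture = texture;
        }
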
  • Shader
Shader "AR paint/ToMaterial" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _Uvpoint1("point1", Vector) = (0 , 0 , 0 , 0)
        _Uvpoint2("point2", Vector) = (0 , 0 , 0 , 0)
        _Uvpoint3("point3", Vector) = (0 , 0 , 0 , 0)
        _Uvpoint4("point4", Vector) = (0 , 0 , 0 , 0)

    }
    SubShader {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        LOD 200

        Pass{
            Blend SrcAlpha OneMinusSrcAlpha

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;
            float4 _Uvpoint1;
            float4 _Uvpoint2;
            float4 _Uvpoint3;
            float4 _Uvpoint4;
            float4x4 _VP;

            struct v2f {
                float4  pos : SV_POSITION;
                float2  uv : TEXCOORD0;
                float4  fixedPos : TEXCOORD2;
            } ;

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP,v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord,_MainTex);
                
                float4 top = lerp(_Uvpoint1, _Uvpoint3, o.uv.x);
                float4 bottom = lerp(_Uvpoint2, _Uvpoint4, o.uv.x);
                float4 fixedPos = lerp(bottom, top, o.uv.y);
                o.fixedPos = ComputeScreenPos(mul(UNITY_MATRIX_VP, fixedPos));
                return o;
            }

            float4 frag (v2f i) : COLOR
            {
                
                float4 top = lerp(_Uvpoint1, _Uvpoint3, i.uv.x);
                float4 bottom = lerp(_Uvpoint2, _Uvpoint4, i.uv.x);
                float4 fixedPos = lerp(bottom, top, i.uv.y);
                fixedPos = ComputeScreenPos(mul(_VP, fixedPos));
                return tex2D(_MainTex, fixedPos.xy / fixedPos.w);
                
            }
            ENDCG
        }
    }
    //FallBack "Diffuse"
}
  • First create a Plane under the ImageTarget with the same size as the recognition image. The core code's job is to convert the world coordinates of this Plane (i.e. the recognition image) into screen coordinates, take a screenshot to use as a texture, and finally hand all of this data to the shader for processing.
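
Put differently: for a model vertex with UV coordinates (u, v), the shader bilinearly interpolates the four corner points to find the matching world position on the recognition image, projects it with the view-projection matrix, and samples the screenshot at that screen position. Schematically, following the shader above (the 0.5 offset is what ComputeScreenPos plus the perspective divide produce, up to a platform-dependent y flip):

    P_world(u, v) = lerp( lerp(_Uvpoint2, _Uvpoint4, u), lerp(_Uvpoint1, _Uvpoint3, u), v )
    P_clip        = _VP * P_world
    uv_screen     = 0.5 * (P_clip.xy / P_clip.w) + 0.5
    color         = tex2D(_MainTex, uv_screen)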

Paste_Image.png

  • Drag the model under the ImageTarget, drag the decorative scene in as well and hide it for now (this is the Scene field that ScreenShot() activates), then drag the sub-meshes that need painting into the ARRender script's fields. Create a new material, set its Shader to AR paint/ToMaterial, and assign this material to every sub-mesh that needs to be colored, i.e. all 7 petals.
  • Create a Button, add an OnClick event, drag in the Flower object, and choose the ScreenShot method
  • Build the package to the phone and test; here are my test results:

85.jpg

8.png

  • About animation

  • 動畫拿到以後根據須要進行切割並Apply

Paste_Image.png

  • Create an Animator Controller, drag the clips you just cut into the animation state machine, right-click to create Transitions, and create a Trigger in the Parameters tab; this lets you control the animation from code (a small sketch follows below)
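
The trigger created here is the one fired from ScreenShot() above with flowerAnimator.SetTrigger("FlowerRainbow"). If you ever need to know when the triggered clip has finished (for example to re-enable the button), a minimal coroutine sketch could live inside ARRender (which already imports System.Collections); the state name "FlowerRainbow" is specific to this project:

    // Sketch: wait until the "FlowerRainbow" state has played through once on layer 0
    private IEnumerator WaitForFlowerAnimation(Animator animator)
    {
        // Wait until the animator has actually entered the state
        while (!animator.GetCurrentAnimatorStateInfo(0).IsName("FlowerRainbow"))
            yield return null;
        // Then wait until one full playthrough has completed
        while (animator.GetCurrentAnimatorStateInfo(0).normalizedTime < 1.0f)
            yield return null;
        Debug.Log("Flower animation finished");
    }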

  • About audio

  • Create an empty GameObject and name it Audio
  • Add an AudioSource component
  • Drag it under the ImageTarget
  • Find the DefaultTrackableEventHandler script on the ImageTarget, declare an AudioSource variable, and add play and pause calls in the OnTrackingFound() and OnTrackingLost() methods respectively

using UnityEngine;
using Vuforia;

namespace Vuforia
{
    public class DefaultTrackableEventHandler : MonoBehaviour,
                                                ITrackableEventHandler
    {
        public AudioSource clothesAudioSource;
        #region PRIVATE_MEMBER_VARIABLES
 
        private TrackableBehaviour mTrackableBehaviour;
    
        #endregion // PRIVATE_MEMBER_VARIABLES



        #region UNTIY_MONOBEHAVIOUR_METHODS
    
        void Start()
        {
            mTrackableBehaviour = GetComponent<TrackableBehaviour>();
            if (mTrackableBehaviour)
            {
                mTrackableBehaviour.RegisterTrackableEventHandler(this);
            }
        }

        #endregion // UNTIY_MONOBEHAVIOUR_METHODS



        #region PUBLIC_METHODS

        /// <summary>
        /// Implementation of the ITrackableEventHandler function called when the
        /// tracking state changes.
        /// </summary>
        public void OnTrackableStateChanged(
                                        TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
        {
            if (newStatus == TrackableBehaviour.Status.DETECTED ||
                newStatus == TrackableBehaviour.Status.TRACKED ||
                newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
            {
                OnTrackingFound();
            }
            else
            {
                OnTrackingLost();
            }
        }

        #endregion // PUBLIC_METHODS



        #region PRIVATE_METHODS


        private void OnTrackingFound()
        {
            Renderer[] rendererComponents = GetComponentsInChildren<Renderer>(true);
            Collider[] colliderComponents = GetComponentsInChildren<Collider>(true);

            // Enable rendering:
            foreach (Renderer component in rendererComponents)
            {
                component.enabled = true;
            }

            // Enable colliders:
            foreach (Collider component in colliderComponents)
            {
                component.enabled = true;
            }
            if (!clothesAudioSource.isPlaying)
            {
               clothesAudioSource.Play(); 
            }
            

            Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " found");
        }


        private void OnTrackingLost()
        {
            Renderer[] rendererComponents = GetComponentsInChildren<Renderer>(true);
            Collider[] colliderComponents = GetComponentsInChildren<Collider>(true);

            // Disable rendering:
            foreach (Renderer component in rendererComponents)
            {
                component.enabled = false;
            }

            // Disable colliders:
            foreach (Collider component in colliderComponents)
            {
                component.enabled = false;
            }

            clothesAudioSource.Pause();

            Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " lost");
        }

        #endregion // PRIVATE_METHODS
    }
}
  • About autofocus

    Vuforia does not autofocus by default. Create a new script, paste the code below into it, and drag the script onto the ARCamera.

using UnityEngine;
using System.Collections;

public class Duijiao : MonoBehaviour
{

    // Use this for initialization
    void Start()
    {
        // Switch the Vuforia camera to continuous autofocus as soon as the scene starts
        Vuforia.CameraDevice.Instance.SetFocusMode(Vuforia.CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    }

    // Update is called once per frame
    void Update()
    {
        Vuforia.CameraDevice.Instance.SetFocusMode(Vuforia.CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    }
}
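
Calling SetFocusMode every frame works but is redundant. A common refinement, sketched here as an alternative (the class name is made up), is to re-apply the focus mode only on startup and when the app resumes, since that is when the device camera restarts:

using UnityEngine;

// Sketch: re-apply continuous autofocus on startup and whenever the app resumes,
// instead of every frame.
public class ContinuousAutoFocus : MonoBehaviour
{
    void Start()
    {
        SetAutoFocus();
    }

    void OnApplicationPause(bool paused)
    {
        if (!paused)
        {
            SetAutoFocus();
        }
    }

    private static void SetAutoFocus()
    {
        Vuforia.CameraDevice.Instance.SetFocusMode(
            Vuforia.CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    }
}
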
  • About multi-image recognition

  • Set the maximum number of simultaneous targets

Paste_Image.png

  • Drag in more ImageTargets; give each ImageTarget its own recognition image and model and you can recognize multiple images

Paste_Image.png

  • Test result

Paste_Image.png

The code is still rather ugly because I have not refactored it yet, but the upside is that it is beginner-friendly and easy to follow.
