Construction Script
Making a timelapse with Photoshop
Hyper Timelapse
Stereographic Projection
Oculus Go application
Delegates
High-resolution screenshots
Materials
- Unreal Material: RotateAboutAxis not producing the expected result
- WorldPos to ScreenUV
float4 RayStartClip = mul(float4(WorldPos, 1), View.WorldToClip);
float3 RayStartScreen = RayStartClip.xyz / RayStartClip.w;
float2 UV = RayStartScreen.xy;
UV = UV * float2(0.5f, 0.5f) + float2(0.5f, 0.5f);
UV.y = 1.0f - UV.y;
return UV;
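The HLSL above can be sanity-checked outside the engine. Below is a plain standalone C++ mirror of the same math; the 4x4 matrix stands in for View.WorldToClip, and all names here are illustrative, not actual UE4 API:

```cpp
#include <cassert>
#include <cmath>

struct Vec4 { float x, y, z, w; };

// Row-vector * matrix, matching the HLSL mul(float4, matrix) convention.
static Vec4 MulRowVec(const Vec4& v, const float m[4][4])
{
    return {
        v.x * m[0][0] + v.y * m[1][0] + v.z * m[2][0] + v.w * m[3][0],
        v.x * m[0][1] + v.y * m[1][1] + v.z * m[2][1] + v.w * m[3][1],
        v.x * m[0][2] + v.y * m[1][2] + v.z * m[2][2] + v.w * m[3][2],
        v.x * m[0][3] + v.y * m[1][3] + v.z * m[2][3] + v.w * m[3][3],
    };
}

// World position -> screen UV: perspective divide to NDC [-1,1],
// remap to [0,1], then flip Y (UV origin is top-left).
static void WorldPosToScreenUV(const Vec4& worldPos, const float worldToClip[4][4],
                               float& u, float& v)
{
    const Vec4 clip = MulRowVec(worldPos, worldToClip);
    u =         (clip.x / clip.w) * 0.5f + 0.5f;
    v = 1.0f - ((clip.y / clip.w) * 0.5f + 0.5f);
}
```

With an identity matrix, the clip-space origin lands at UV (0.5, 0.5), which is a quick way to check the remap and Y flip.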
Qoocam
- QooCam Youtube Tutorial : https://www.youtube.com/playlist?list=PLmjwCk5juJ-9Gbz4wmUMmhwJQbs8Y9hMj
- A new SMART way to Improve Low-light photos via RAW image stacking - Kandao RAW+
- Introducing depthmap feature in Kandao Studio
- Depth Mode Stitching Updates of Kandao Studio v2.7.0
- 360° Depth Map Stitching & Exporting – A New Features of Obsidian 3D VR Camera and Kandao Studio
- Blueprint Blending Between Multiple Skylight Cubemaps | Live Training | Unreal Engine
- Stereo Panoramic Plugin | Live Training | Unreal Engine
- Getting Started with World Composition | Live Training | Unreal Engine
- Media Framework 4.18 Preview | Live Training | Unreal Engine Livestream
- Procedural Mesh Slicing | Live Training | Unreal Engine
Sequencer
- Sequencer Play list
- Robo Recall Sequencer Cinematics | Live Training | Unreal Engine
- Working with Sequencer | Live Training | Unreal Engine
- http://api.unrealengine.com/JPN/Engine/Sequencer/index.html
- https://www.youtube.com/watch?v=pKISyXhTugw
- Cinematics with Sequencer: Sequencer Editor Intro | 03 | v4.12 Tutorial Series | Unreal Engine
- 1 : https://www.youtube.com/watch?v=eaNCGmEJ_m0
- 2 : https://www.youtube.com/watch?v=GRgG2dqbuoQ&list=PLZlv_N0_O1gaiA_sfpjATUprVW7B9FcK1&index=2
- 3 https://www.youtube.com/watch?v=ryLbX3LIPw0
- 4 https://www.youtube.com/watch?v=e4fr23pVEuo
- 5 https://www.youtube.com/watch?v=q8g0oL9YRBs
- 6 https://www.youtube.com/watch?v=PcDvDrF5F0Q
- 7 https://www.youtube.com/watch?v=9MWy3qIL2gs
- 8 : https://www.youtube.com/watch?v=wK5qQRSs-J8
- 9 : https://www.youtube.com/watch?v=Q-BxhD0RTGA
- 10 : https://www.youtube.com/watch?v=eaNCGmEJ_m0
- 11 : https://www.youtube.com/watch?v=qJNKufrzRRk
- 12 : https://www.youtube.com/watch?v=K7xW7RyKNfg
Kandao Qoocam DNG8
Mistika VR
UE4 GearVR
GearVR project settings
Graphics settings
Project settings: optimization
- http://api.unrealengine.com/JPN/Platforms/GearVR/BestPractices/index.html
- https://docs.unrealengine.com/en-us/Platforms/VR/ContentSetup
- https://framesynthesis.jp/tech/unrealengine/performance/
- Remove unused plugins
- Set [Exclude editor content when cooking], under [Advanced Properties] at the bottom of the [Packaging] settings, to True
- Change Max Dynamic Point Lights in the [Mobile] section of [Project Settings] from 4 to 0
- Gear VR projects do not support Mobile HDR, so do not enable it.
- Under [Project Settings] > [Rendering] > [VR], enable [Mobile Multi-View] and [Mobile Multi-View Direct]; as a rule, enable both.
- To get debug drawing while Multi-View is in use, enable [Debug Canvas in Layer]
- Support status
- For Gear VR, Multi-View is supported on Note5, S6, S7, S8, and S9 (and later) phones using ARM Exynos processors and running Android M or N. It is also supported on S7, S8, and S9 (and later) phones using Qualcomm processors and running Android N.
- Oculus Go and all supported Samsung phones support Multi-View with OpenGL ES 2.
- Oculus Go, S8, and S9 phones also support Multi-View with OpenGL ES 3.1.
- For Exynos devices, verify that Support OpenGL ES2 is checked in the Build section in Platforms > Android, and that Support OpenGL ES3 is not selected.
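The support matrix above can be condensed into a small lookup. A hypothetical standalone C++ sketch; only the device names and GL ES rules come from the notes above, the function and its API are illustrative:

```cpp
#include <cassert>
#include <string>

// Hypothetical helper encoding the Multi-View notes above:
// Oculus Go and all supported Samsung phones handle Multi-View with
// OpenGL ES 2; Oculus Go, S8, and S9 additionally handle OpenGL ES 3.1.
// Device names are illustrative strings, not real platform identifiers.
static bool SupportsMultiView(const std::string& device, int glesMajor, int glesMinor)
{
    const bool es2  = (glesMajor == 2);
    const bool es31 = (glesMajor == 3 && glesMinor >= 1);
    if (device == "OculusGo" || device == "S8" || device == "S9")
        return es2 || es31;
    // Other supported Samsung phones (Note5, S6, S7, ...): ES2 path only.
    return es2;
}
```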
- Enable [Monoscopic Far Field]
Hangs
- With [Mobile Multi-View], [Mobile Multi-View Direct], and [Monoscopic Far Field] all enabled, a Galaxy S7 Edge hangs, apparently around translucency rendering.
- With [Mobile Multi-View] and [Mobile Multi-View Direct] enabled, the screen sometimes fails to appear. Out of memory?
- Running ProfileGPU hangs and then kills the app after debug output like the following
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:862][550]LogRHI: Perf marker hierarchy, total GPU time 0.00ms
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:862][550]LogRHI: WARNING: This GPU profile was captured with v-sync enabled. V-sync wait time may show up in any bucket, and as a result the data in this profile may be skewed. Please profile with v-sync disabled to obtain the most accurate data.
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:862][550]LogRHI: Warning: Profiled range "disjoinness" could not be determined due to lack of disjoint timer query functionality on this platform.
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:862][550]LogRHI: nan% 0.00ms FRAME 68 draws 12344 prims 37032 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:862][550]LogRHI: nan% 0.00ms MobileBasePass 10 draws 5798 prims 17394 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:862][550]LogRHI: nan% 0.00ms View0 5 draws 2899 prims 8697 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:862][550]LogRHI: nan% 0.00ms View1 5 draws 2899 prims 8697 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms Translucency 4 draws 2016 prims 6048 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms View0 2 draws 1008 prims 3024 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms View1 2 draws 1008 prims 3024 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms PostProcessing 49 draws 2038 prims 6114 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms View0 25 draws 1019 prims 3057 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms DistortionAccumulatePass 3 draws 1008 prims 3024 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms DistortionMergePass 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:863][550]LogRHI: nan% 0.00ms PostProcessBloomSetup 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessBloomUp 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessBloomUp 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessBloomUp 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms PostProcessSunMerge 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:864][550]LogRHI: nan% 0.00ms Tonemapper(ES2 FramebufferFetch=0) 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms View1 24 draws 1019 prims 3057 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms DistortionAccumulatePass 3 draws 1008 prims 3024 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms DistortionMergePass 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms PostProcessBloomSetup 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms PostProcessBloomDown 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:865][550]LogRHI: nan% 0.00ms PostProcessBloomUp 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12868 D UE4 : [2019.01.01-14.43.57:865][550]LogBlueprintUserMessages: [GearVR_Pawn_2] XY=-0.1190.956
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:866][550]LogRHI: nan% 0.00ms PostProcessBloomUp 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:866][550]LogRHI: nan% 0.00ms PostProcessBloomUp 2 draws 1 prims 3 verts
01-01 23:43:57.850 12851 12868 D UE4 : [2019.01.01-14.43.57:866][550]LogBlueprintUserMessages: [GearVR_Pawn_2] XY=-0.1060.956
01-01 23:43:57.850 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms PostProcessSunMerge 2 draws 1 prims 3 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms Tonemapper(ES2 FramebufferFetch=0) 1 draws 1 prims 3 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms RenderFinish 2 draws 184 prims 552 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms View0 1 draws 92 prims 276 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms CanvasFlush 1 draws 92 prims 276 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms View1 1 draws 92 prims 276 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms CanvasFlush 1 draws 92 prims 276 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms SlateUI 2 draws 2308 prims 6924 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms DrawDebugCanvas 2 draws 2308 prims 6924 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: nan% 0.00ms CanvasFlush 2 draws 2308 prims 6924 verts
01-01 23:43:57.860 12851 12957 D UE4 : [2019.01.01-14.43.57:867][550]LogRHI: Total Nodes 51 Draws 68
Scene
- Use Precomputed Visibility, make things Static, build lighting
Post-processing
- AutoExposure only?
Materials
Textures
Video playback
- If [Exclude movie files when staging] in Project Packaging is enabled, videos do not play
Rendering cost
Console commands
- VR Console Commands
- A summary of UE4 HMD-related console commands
- Stereo On/Off
- vr.PixelDensity
- Sets the resolution of the VR panel. Raise the value for quality; lower it for performance.
- The "hmd stats" command displays the current Oculus Rift settings (hmd pd value, IPD, field of view, latency, etc.); turn it off with "hmd stats off".
- Changing how the picture is mirrored on the PC screen
- Set CPU and GPU Level
Hardware
Galaxy S7 Edge
- Samsung Galaxy S7 Edge Dual G935FD (Dual SIM) (parallel import) (32GB, Black)
- OS Android 6.0 (Marshmallow), upgradable to Android 8.0 (Oreo)
- Chipset
- Exynos 8890 Octa (14 nm) - G935FD, G935F, G935W8
- CPU
- Octa-core (4x2.3 GHz Mongoose & 4x1.6 GHz Cortex-A53) - G935FD, G935F, G935W8
- GPU
- ETC2 : OpenGL ES 3.0
- ASTC : officially OpenGL ES 3.2
- 3600 mAh
- Poweradd Pilot X7 20000mAh
Oculus Go (Galaxy S7)
- Snapdragon 820/Snapdragon 821 Adreno 530
- GFLOPS : 407.4/498.5/519.2
- Core : 256, Mhz : 510/624/650
- OpenGLES : 3.2 (3.1 + AEP), DX12.1
- 4000mAh
- https://ja.wikipedia.org/wiki/Adreno
- The sky-sphere texture sometimes looked blocky
- Apparently due to insufficient UV precision
- Fixed by enabling Mobile -> Use Full Precision on the material
- https://answers.unrealengine.com/questions/813917/oculus-go-texture-resolution-degrades-over-time.html
- Using Mobile MSAA makes the screen go completely black
Stereo Layer
- On GearVR and Oculus Go, only World-locked layers are really usable
- Cubemap layers and the like did not work
Oculus Quest (Galaxy S8)
- Snapdragon 835 Adreno 540
- GFLOPS : ???
Galaxy S9
- Snapdragon 845 Adreno 630
- GFLOPS : 727
- ALU : 256, 710 MHz
- OpenGL ES 3.2, DX12.1
S7 Edge
- r.BloomQuality (RHI)
- 0 : 22.0msec
- 1 : 28msec
- 2-5 : 29msec
- MobileMSAACount
- 1 : 22 msec
- 2/4 : 24 msec
Google Pixel C
- 10.2 in (260 mm) 1:√2 (64:45) aspect ratio, 308 ppi pixel density 2560x1800 px backlit
- https://androidlover.net/google-pixel-c
- https://en.wikipedia.org/wiki/Pixel_C
iPad Pro 12.9
- Liquid Retina display, 12.9-inch (diagonal), 2732 x 2048 pixel resolution, 264 ppi
- Wide color display (P3), True Tone display
Stereo Pano Ue4
- Stereo Panoramic Capture tool
- [UE4] How to make 360° videos
- Stereo Pano Camera Tutorial for UE4
- Capturing stereoscopic 360-degree screenshots and movies in UE4
- Stereo Panoramic Plugin | Live Training | Unreal Engine
- https://www.youtube.com/watch?v=VfuQv_5RDRA
- Blueprint Blending Between Multiple Skylight Cubemaps | Live Training | Unreal Engine
VR Motion Controller Laser Beam
Oculus Go UE4
UE4 Oculus Rift
- Developing for Oculus Rift
- Creating VR content with Unreal Engine 4 + Oculus Rift + Touch
- UE4 + Oculus Rift development notes
- Render To Texture Blueprint Toolset
- [UE4] Bake color-adjusted textures from materials to cut processing cost!
Transparent
Xkeymacs windows7
- https://blog.kakakikikeke.com/2012/04/xkeymacswindows7.html
- Notes on configuring Emacs key bindings on Windows 10
- ChangeKey
- Initial setup notes for Emacs users on Windows 10
UDataAsset
- https://api.unrealengine.com/INT/API/Runtime/Engine/Engine/UDataAsset/index.html
- http://historia.co.jp/archives/7564/
- http://historia.co.jp/archives/6567/
- http://api.unrealengine.com/INT/Programming/Assets/ReferencingAssets/index.html
- http://api.unrealengine.com/INT/Engine/Basics/AssetsAndPackages/AssetManagement/index.html
- http://api.unrealengine.com/INT/Programming/Assets/AsyncLoading/index.html
Landscape
UE4 VS
- http://api.unrealengine.com/JPN/Programming/Development/VisualStudioSetup/
- http://api.unrealengine.com/JPN/Programming/Development/VisualStudioSetup/UnrealVS/index.html
- https://docs.unrealengine.com/en-us/Programming/Development/VisualStudioSetup
- http://papersloth.hatenablog.com/entry/ac2017
- http://papersloth.hatenablog.com/entry/2017/11/15/232411
Visual Studio Snippets for Unreal Engine C++ Projects
- How to install snippets?
- Method One: paste .snippet files into C:\Users\$user$\Documents\Visual Studio 2013\Code Snippets\Visual C++\My Code Snippets, then restart VS.
- Method Two: open Visual Studio, navigate to TOOLS -> Code Snippets Manager… -> Import…
- How to use snippets?
- Method One: just start typing ue4... the snippet list loads as a combo box. Use the arrows to select a snippet, then hit ENTER or TAB to insert it.
- Method Two: type the full snippet name and hit TAB; you don't have to wait for VS to show the snippet list. Use TAB and SHIFT + TAB to move between highlighted fields. After entering all names, hit ENTER.
- Snippets
- ue4classa – Blueprintable class that derives from AActor. Parameters: comment, class name, base class name.
- ue4classu – Blueprintable class that derives from UObject. Parameters: comment, class name, base class name.
- ue4struct – Simple structure. Parameters: comment, name.
- ue4interface – Simple UE4 interface. Parameters: comment, name.
- ue4bpevent – Function usable as an event in Blueprint. Parameters: comment, UI category, virtual and const modifiers, function name, arguments.
- ue4bpfunc – Function available to Blueprint logic. Parameters: comment (parameters and return value), UI category, virtual and const modifiers, function name, arguments.
- ue4prop – Read/write property available everywhere (Blueprint, instance, and archetype details). Parameters: comment, category, type, name.
- ue4enum – Simple enum. Parameters: comment, enum name, first member name and its comment.
- ue4enumdisplay – Enum usable with Blueprints. Parameters: comment, enum name, first member name, its display name and comment.
- ue4log – Simplest log line. Parameters: category, verbosity, message.
- ue4logdeclare – Declaration of a log category. Place this in your project's main header to allow logging. Parameters: category, default verbosity, compile-time verbosity.
- ue4logdefine – Definition of a log category. Place this in the main code file. Parameter: category name.
- ue4logfloat – Log line for printing a float value. Parameters: category, verbosity, variable name.
- ue4logint – Log line for logging an integer value. Parameters: category, verbosity, variable name.
- ue4loguobj – Log line designed for logging from inside objects; by default the square brackets contain the name of the object writing the log. Parameters: category, verbosity, message, name of a pointer to the object.
- ue4mark – Marks changes in engine classes. Parameters: company symbol, task/ticket number, developer name and surname, short description of the modification.
- ue4eve – 9 snippets, one per parameter combination, for creating an event. Parameters: owning type, event type name.
- ue4del – 9 snippets, one per parameter combination, for creating a delegate. Parameters: delegate type name, parameter type names.
- ue4delmul – 9 snippets, one per parameter combination, for creating a multicast delegate. Parameters: delegate type name, parameter type names.
- ue4deldyn – 9 snippets, one per parameter combination, for creating a dynamic delegate. Parameters: delegate type name, parameter type names, display values.
- ue4deldynmul – 9 snippets, one per parameter combination, for creating a dynamic multicast delegate. Parameters: delegate type name, parameter type names, display values.
Dynamic Material Instance
UE4 ScreenShot
Instancing
- https://answers.unrealengine.com/questions/502169/customise-uvs-per-instance-with-instanced-static-m.html
- https://markdownshare.com/view/a4b04e89-83bc-48bd-8ec5-229ab407f275
- https://answers.unrealengine.com/questions/280115/is-it-possible-to-use-instanced-material-on-instan.html
- https://github.com/ufna/VaOcean/tree/master
- https://github.com/IntelSoftware/ue4-parallel
- https://software.intel.com/en-us/articles/code-sample-an-approach-to-parallel-processing-with-unreal-engine
- https://software.intel.com/en-us/articles/unreal-engine-4-parallel-processing-school-of-fish
- asfloat : https://docs.microsoft.com/en-us/windows/desktop/direct3dhlsl/dx-graphics-hlsl-asfloat
- asint : https://docs.microsoft.com/en-us/windows/desktop/direct3dhlsl/dx-graphics-hlsl-asint
- PerInstanceRandom : http://api.unrealengine.com/JPN/Engine/Rendering/Materials/ExpressionReference/Constant/
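asfloat/asint above reinterpret the raw bits of a value rather than converting it numerically (handy when packing data through channels like PerInstanceRandom). A standalone C++ sketch of the same operation, with illustrative names:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Plain-C++ equivalents of HLSL asfloat()/asint(): reinterpret the same
// 32 bits with no numeric conversion. memcpy avoids strict-aliasing UB.
static float AsFloat(uint32_t bits)
{
    float f;
    std::memcpy(&f, &bits, sizeof(f));
    return f;
}

static uint32_t AsUint(float f)
{
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof(bits));
    return bits;
}
```

For example, 1.0f has the IEEE-754 bit pattern 0x3F800000, so AsUint(1.0f) returns that integer rather than 1.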
Memo
Multithread Game architecture
- "Parallelizing the Naughty Dog engine using fibers" GDC2015
- "Destiny’s Multi-threaded Renderer Architecture" GDC2015
- "FRAMEGRAPH: Extensible Rendering Architecture in Frostbite" GDC2017
- Custom Material Node: How to use and create Metaballs | Live Training | Unreal Engine
- "Unreal Engine 4 Custom Shaders Tutorial"
UE4 sample c++
UE4 Console variables
static const auto CVar = IConsoleManager::Get().FindTConsoleVariableDataInt(TEXT("r.VSync"));
// Limit framerate on console if VSYNC is enabled to avoid jumps from 30 to 60 and back.
if (CVar->GetValueOnGameThread() != 0)
{
    // do something
}

// Use a command line switch to optionally set the vsync console variable
if (FParse::Param(FCommandLine::Get(), TEXT("vsync")))
{
    new(GEngine->DeferredCommands) FString(TEXT("r.vsync 1"));
}
Async
- Using Async Collision Traces in Unreal Engine 4
- Can an Async Line Trace be done in Blueprint?
- What is the async scene?
Notes of PostProcessing in UE4
- APostProcessVolume
- FPostProcessSettings
- FFinalPostProcessSettings
Things to try when Wi-Fi keeps dropping after upgrading to Windows 10
UE4 Physical Materials User Guide
UE4 OpenWorld Collections
Notes on the UE4.18 source
FDeferredShadingSceneRenderer::Render() at a glance
// DeferredShadingRenderer.cpp
// Work before the prepass : DepthRendering.cpp
RenderPrePass()
// Light grid for translucent lighting : LightGridInjection.cpp
ComputeLightGrid()
// Occlusion queries
RenderOcclusion()
FinishOcclusion()
// Shadow depth maps : ShadowDepthRendering.cpp
RenderShadowDepthMaps()
// Volumetric fog : VolumetricFog.cpp
ComputeVolumetricFog()
if (forward)
{
    // Forward shadow projections
    RenderForwardShadingShadowProjections()
    // Indirect capsule shadows : CapsuleShadows.cpp
    RenderIndirectCapsuleShadows()
}
// Clear the GBuffer
SetAndClearViewGBuffer()
// Base pass : BasePassRendering.cpp
RenderBasePass()
// Custom depth
RenderCustomDepthPassAtLocation()
// Velocities : VelocityRendering.cpp
RenderVelocities()
// Post-base-pass composition : CompositionLighting.cpp
GCompositionLighting.ProcessAfterBasePass()
// Deferred lighting
if (deferred)
{
    // Indirect capsule shadows : CapsuleShadowRendering.cpp
    RenderIndirectCapsuleShadows()
    // DFAO
    RenderDFAOAsIndirectShadowing()
    // Lighting : LightRendering.cpp
    RenderLights()
    // Filter the translucency lighting volume : TranslucentLighting.cpp
    FilterTranslucentVolumeLighting()
    // Dynamic skylight : DistanceFieldAmbientOcclusion.cpp
    RenderDynamicSkyLighting()
    // Reflections : ReflectionEnvironment.cpp
    RenderDeferredReflections()
    // Screen-space SSS etc. : CompositionLighting.cpp
    GCompositionLighting.ProcessAfterLighting()
}
// Light shaft occlusion : LightShaftRendering.cpp
RenderLightShaftOcclusion()
// Atmospheric scattering fog : AtmosphereRendering.cpp
RenderAtmosphere()
// Fog : FogRendering.cpp
RenderFog()
// Translucency : TranslucentRendering.cpp
RenderTranslucency()
// Distortion/refraction : DistortionRendering.cpp
RenderDistortion()
// Light shafts with bloom : LightShaftRendering.cpp
RenderLightShaftBloom()
// Distance field lighting
RenderDistanceFieldLighting()
// Post-processing : PostProcessing.cpp
GPostProcessing.Process()
FDeferredShadingSceneRenderer::RenderLights() : LightRendering.cpp
if (bAllowSimpleLights)
{
    GatherSimpleLights(ViewFamily, Views, SimpleLights); // LightRendering.cpp
}
// Build a list of visible lights.
for (TSparseArray<FLightSceneInfoCompact>::TConstIterator LightIt(Scene->Lights); LightIt; ++LightIt)
{
    ...
}
// Sort
SortedLights.Sort( FCompareFSortedLightSceneInfo() );
// Iterate over all lights to be rendered and build ranges for tiled deferred and unshadowed lights
for (int32 LightIndex = 0; LightIndex < SortedLights.Num(); LightIndex++)
{
    ...
}
if (ViewFamily.EngineShowFlags.DirectLighting)
{
    if (ShouldUseTiledDeferred(SupportedByTiledDeferredLightEnd, SimpleLights.InstanceData.Num())
        && !bAnyUnsupportedByTiledDeferred && !bAnyViewIsStereo)
    {
        RenderTiledDeferredLighting();
    }
    if (bRenderSimpleLightsStandardDeferred)
    {
        RenderSimpleLightsStandardDeferred(RHICmdList, SimpleLights);
    }
    // Draw non-shadowed non-light function lights without changing render targets between them
    for (int32 LightIndex = StandardDeferredStart; LightIndex < AttenuationLightStart; LightIndex++)
    {
        const FSortedLightSceneInfo& SortedLightInfo = SortedLights[LightIndex];
        const FLightSceneInfo* const LightSceneInfo = SortedLightInfo.LightSceneInfo;
        // Render the light to the scene color buffer, using a 1x1 white texture as input
        RenderLight(RHICmdList, LightSceneInfo, NULL, false, false);
    }
    if (GUseTranslucentLightingVolumes && GSupportsVolumeTextureRendering)
    {
        if (AttenuationLightStart)
        {
            // Inject non-shadowed, non-light function lights in to the volume.
            SCOPED_DRAW_EVENT(RHICmdList, InjectNonShadowedTranslucentLighting);
            InjectTranslucentVolumeLightingArray(RHICmdList, SortedLights, AttenuationLightStart);
        }
        if (SimpleLights.InstanceData.Num() > 0)
        {
            SCOPED_DRAW_EVENT(RHICmdList, InjectSimpleLightsTranslucentLighting);
            InjectSimpleTranslucentVolumeLightingArray(RHICmdList, SimpleLights);
        }
    }
}
// Draw shadowed and light function lights
for (int32 LightIndex = AttenuationLightStart; LightIndex < SortedLights.Num(); LightIndex++)
{
    if (bDrawShadows)
    {
        RenderShadowProjections(RHICmdList, &LightSceneInfo, ScreenShadowMaskTexture, bInjectedTranslucentVolume);
    }
    // Render light function to the attenuation buffer.
    if (bDirectLighting)
    {
        if (bDrawLightFunction)
        {
            const bool bLightFunctionRendered = RenderLightFunction(RHICmdList, &LightSceneInfo, ScreenShadowMaskTexture, bDrawShadows, false);
            bUsedShadowMaskTexture |= bLightFunctionRendered;
        }
        if (bDrawPreviewIndicator)
        {
            RenderPreviewShadowsIndicator(RHICmdList, &LightSceneInfo, ScreenShadowMaskTexture, bUsedShadowMaskTexture);
        }
    }
    // Render the light to the scene color buffer, conditionally using the attenuation buffer or a 1x1 white texture as input
    if (bDirectLighting)
    {
        RenderLight(RHICmdList, &LightSceneInfo, ScreenShadowMaskTexture, false, true);
    }
}
FSceneRenderer::GatherSimpleLights( )
void FSceneRenderer::GatherSimpleLights(const FSceneViewFamily& ViewFamily, const TArray<FViewInfo>& Views, FSimpleLightArray& SimpleLights)
{
    TArray<const FPrimitiveSceneInfo*, SceneRenderingAllocator> PrimitivesWithSimpleLights;
    // Gather visible primitives from all views that might have simple lights
    for (int32 ViewIndex = 0; ViewIndex < Views.Num(); ViewIndex++)
    {
        const FViewInfo& View = Views[ViewIndex];
        for (int32 PrimitiveIndex = 0; PrimitiveIndex < View.VisibleDynamicPrimitives.Num(); PrimitiveIndex++)
        {
            const FPrimitiveSceneInfo* PrimitiveSceneInfo = View.VisibleDynamicPrimitives[PrimitiveIndex];
            const int32 PrimitiveId = PrimitiveSceneInfo->GetIndex();
            const FPrimitiveViewRelevance& PrimitiveViewRelevance = View.PrimitiveViewRelevanceMap[PrimitiveId];
            if (PrimitiveViewRelevance.bHasSimpleLights)
            {
                // TArray::AddUnique is slow, but not expecting many entries in PrimitivesWithSimpleLights
                PrimitivesWithSimpleLights.AddUnique(PrimitiveSceneInfo);
            }
        }
    }
    // Gather simple lights from the primitives
    for (int32 PrimitiveIndex = 0; PrimitiveIndex < PrimitivesWithSimpleLights.Num(); PrimitiveIndex++)
    {
        const FPrimitiveSceneInfo* Primitive = PrimitivesWithSimpleLights[PrimitiveIndex];
        Primitive->Proxy->GatherSimpleLights(ViewFamily, SimpleLights);
    }
}
FPostProcessing::Process() at a glance
// PostProcessing.cpp
FRenderingCompositePassContext CompositeContext(RHICmdList, View);
FPostprocessContext Context(RHICmdList, CompositeContext.Graph, View);
...
if (AllowFullPostProcessing(View, FeatureLevel))
{
    AddPostProcessMaterial(Context, BL_BeforeTranslucency, SeparateTranslucency);
    ...
    // Gaussian DOF case
    if (VelocityInput.IsValid())
    {
        bSepTransWasApplied = AddPostProcessDepthOfFieldGaussian(Context, DepthOfFieldStat, VelocityInput, SeparateTranslucency);
    }
    else
    {
        FRenderingCompositePass* NoVelocity = Context.Graph.RegisterPass(new(FMemStack::Get()) FRCPassPostProcessInput(GSystemTextures.BlackDummy));
        FRenderingCompositeOutputRef NoVelocityRef(NoVelocity);
        bSepTransWasApplied = AddPostProcessDepthOfFieldGaussian(Context, DepthOfFieldStat, NoVelocityRef, SeparateTranslucency);
    }
    ...
    if (SeparateTranslucency.IsValid() && !bSepTransWasApplied)
    {
        // separate translucency is done here or in AddPostProcessDepthOfFieldBokeh()
        FRenderingCompositePass* NodeRecombined = Context.Graph.RegisterPass(new(FMemStack::Get()) FRCPassPostProcessBokehDOFRecombine(bIsComputePass));
        NodeRecombined->SetInput(ePId_Input0, Context.FinalOutput);
        NodeRecombined->SetInput(ePId_Input2, SeparateTranslucency);
        Context.FinalOutput = FRenderingCompositeOutputRef(NodeRecombined);
    }
    AddPostProcessMaterial(Context, BL_BeforeTonemapping, SeparateTranslucency);
    // Temporal AA
    if (AntiAliasingMethod == AAM_TemporalAA && ViewState)
    {
        if (VelocityInput.IsValid())
        {
            AddTemporalAA(Context, VelocityInput);
        }
        else
        {
            FRenderingCompositePass* NoVelocity = Context.Graph.RegisterPass(new(FMemStack::Get()) FRCPassPostProcessInput(GSystemTextures.BlackDummy));
            FRenderingCompositeOutputRef NoVelocityRef(NoVelocity);
            AddTemporalAA(Context, NoVelocityRef);
        }
    }
    // Auto-exposure histogram
    // Downsampling
    // Auto-exposure correction
    // Bloom
    // Tonemapper
    auto Node = AddSinglePostProcessMaterial(Context, BL_ReplacingTonemapper);
    if (Node)
    {
        // a custom tonemapper is provided
        Node->SetInput(ePId_Input0, Context.FinalOutput);
        // We are binding separate translucency here because the post process SceneTexture node can reference
        // the separate translucency buffers through ePId_Input1.
        // TODO: Check if material actually uses this texture and only bind if needed.
        Node->SetInput(ePId_Input1, SeparateTranslucency);
        Node->SetInput(ePId_Input2, BloomOutputCombined);
        Context.FinalOutput = Node;
    }
    else
    {
        Tonemapper = AddTonemapper(Context, BloomOutputCombined, AutoExposure.EyeAdaptation, AutoExposure.MethodId, false, bHDRTonemapperOutput);
    }
    // FXAA
}
else
{
}
// Visualization
...
if (bDoScreenPercentage)
{
    // Upscale
    FRenderingCompositePass* Node = Context.Graph.RegisterPass(new(FMemStack::Get()) FRCPassPostProcessUpscale(View, UpscaleQuality, PaniniConfig));
    Node->SetInput(ePId_Input0, FRenderingCompositeOutputRef(Context.FinalOutput)); // Bilinear sampling.
    Node->SetInput(ePId_Input1, FRenderingCompositeOutputRef(Context.FinalOutput)); // Point sampling.
    Context.FinalOutput = FRenderingCompositeOutputRef(Node);
}
Context.FinalOutput = AddPostProcessMaterialChain(Context, BL_AfterTonemapping, SeparateTranslucency, PreTonemapHDRColor, PostTonemapHDRColor);
// Execute the graph/DAG
CompositeContext.Process(Context.FinalOutput.GetPass(), TEXT("PostProcessing"));
...
Notes of RenderThread
- WindowsRunnableThread.cpp
- (0) uint32 FRunnableThreadWin::GuardedRun()
- (1) uint32 FRunnableThreadWin::Run()
- RenderingThread.cpp
- (2) virtual uint32 RenderingThread::Run(void) override
- (3) void RenderingThreadMain( FEvent* TaskGraphBoundSyncEvent )
- TaskGraph.cpp
- (4) virtual void FNamedTaskThread::ProcessTasksUntilQuit(int32 QueueIndex) override
- (5) void FNamedTaskThread::ProcessTasksNamedThread(int32 QueueIndex, bool bAllowStall)
- TaskGraphInterfaces.h
- (6) virtual void TGraphTask::ExecuteTask(TArray<FBaseGraphTask*>& NewTasks, ENamedThreads::Type CurrentThread) final override
- SceneRendering.cpp
- (7) static void RenderViewFamily_RenderThread(FRHICommandListImmediate& RHICmdList, FSceneRenderer* SceneRenderer)
- DeferredShading.cpp
- (8) void FDeferredShadingSceneRenderer::Render(FRHICommandListImmediate& RHICmdList)
Forward shading
Game side
UWorld
- https://docs.unrealengine.com/latest/INT/API/Runtime/Engine/Engine/UWorld/
- The World is the top level object representing a map or a sandbox in which Actors and Components will exist and be rendered.
- A World can be a single Persistent Level with an optional list of streaming levels
- that are loaded and unloaded via volumes and blueprint functions or it can be a collection of levels organized with a World Composition.
- In a standalone game, generally only a single World exists except during seamless area transitions
- when both a destination and current world exists. In the editor many Worlds exist:
- The level being edited, each PIE instance, each editor tool which has an interactive rendered viewport, and many more.
ULevel
- http://api.unrealengine.com/INT/API/Runtime/Engine/Engine/ULevel/index.html
- The level object. Contains the level's actor list, BSP information, and brush list.
- Every Level has a World as its Outer and can be used as the PersistentLevel,
- however, when a Level has been streamed in the OwningWorld represents the World that it is a part of.
- A Level is a collection of Actors (lights, volumes, mesh instances etc.).
- Multiple Levels can be loaded and unloaded into the World to create a streaming experience.
UPrimitiveComponent : base class for things that interact with rendering and physics; the unit of culling
- https://docs.unrealengine.com/latest/INT/API/Runtime/Engine/Components/UPrimitiveComponent/index.html
- PrimitiveComponents are SceneComponents that contain or generate some sort of geometry,
- generally to be rendered or used as collision data.
- There are several subclasses for the various types of geometry,
- but the most common by far are the ShapeComponents (Capsule, Sphere, Box),
- StaticMeshComponent, and SkeletalMeshComponent.
- ShapeComponents generate geometry that is used for collision detection but are not rendered,
- while StaticMeshComponents and SkeletalMeshComponents contain pre-built geometry that is rendered, but can also be used for collision detection.
- ULightComponent : class for lights
- FSceneView : a single viewpoint rendering an FScene
RenderThread side
- FScene : the render-thread version of UWorld
USceneComponent : base class for elements added to an FScene
- https://docs.unrealengine.com/latest/INT/API/Runtime/Engine/Components/USceneComponent/index.html
- A SceneComponent has a transform and supports attachment, but has no rendering or collision capabilities.
- Useful as a 'dummy' component in the hierarchy to offset others.
FPrimitiveSceneProxy : the render-thread version of UPrimitiveComponent
- https://docs.unrealengine.com/latest/INT/API/Runtime/Engine/FPrimitiveSceneProxy/
- Encapsulates the data which is mirrored to render a UPrimitiveComponent parallel to the game thread.
- This is intended to be subclassed to support different primitive types.
FPrimitiveSceneInfo : the renderer's internal state; links a UPrimitiveComponent with its FPrimitiveSceneProxy
- https://docs.unrealengine.com/latest/INT/API/Runtime/Renderer/FPrimitiveSceneInfo/
- The renderer's internal state for a single UPrimitiveComponent .
- This has a one to one mapping with FPrimitiveSceneProxy , which is in the engine module.
FViewInfo : the render-thread version of FSceneView
FSceneViewState : stores view information that persists across multiple frames
- http://api.unrealengine.com/INT/API/Runtime/Engine/FSceneViewStateInterface/index.html
- The scene manager's persistent view state.
FSceneRenderer : transient state created every frame
Materials
FMaterial : interface to a material for rendering
FMaterialResource : implementation of FMaterial
FMaterialRenderProxy : the renderer-side version
UMaterialInterface : the game-thread side
UMaterial : the material asset
UMaterialInstance : an instance of a material
UMaterialInstanceConstant : a material instance with constant parameters
UMaterialInstanceDynamic : changeable at runtime
Using Instanced Static Meshes in C++
DebugPrint
#include "Engine.h"

if (GEngine != nullptr)
{
    FString str = FString::Printf(TEXT("last_render_time = %f"), render_time);
    GEngine->AddOnScreenDebugMessage(2, 1.0f, FColor::Red, str);
}
UE4 RenderTask
- class FRenderBasePassDynamicDataThreadTask : public FRenderTask
- class FRenderPrepassDynamicDataThreadTask : public FRenderTask
- class FDrawShadowMeshElementsThreadTask : public FRenderTask
- class FRenderDepthDynamicThreadTask : public FRenderTask
- class FDrawSortedTransAnyThreadTask : public FRenderTask
- class FRenderVelocityDynamicThreadTask : public FRenderTask