PS. By the way, CT ... when AppCleaner was running, it listed a few plists - like you mentioned - that I was unable to locate using Finder or by looking through folders. I don't know how SelfControl hides its stuff, but it works.
Nope. The idea behind SelfControl is that it's hard to crack, so you don't have easy access to the web. Supposedly you can even delete the app and it still works until the timer runs out. In my case, the timer ran out and gave me access to the web again, but I can't get the app to work anymore. It's stuck at zero.
Version 17.4.6

Key Features
- Hardware accelerated Apple ProRes on Apple M1 Pro and M1 Max.
- Faster DaVinci Neural Engine performance on Mac OS Monterey.
- Native HDR viewers and 120 Hz playback on supported MacBook Pros.
- Native Dropbox and Dropbox Replay integration with render presets.
- Sync markers, comments and annotations with Dropbox and Dropbox Replay.
- Export timeline markers as YouTube video or QuickTime chapters.
- Steinberg VST3 support with access to even more audio effects.
- Simplified auto color management settings with SDR and HDR selection.
- Improved 3D keyer and matte finesse controls.
- New Resolve FX including film halation and custom mixers.
- Text+ support for combined glyphs, right-to-left text and vertical layouts.
- Subtitles can auto resize backgrounds and decompose to the parent timeline.

Apple M1 Pro and M1 Max
- Hardware accelerated Apple ProRes on Apple M1 Pro and M1 Max.
- Faster DaVinci Neural Engine performance on Mac OS Monterey.
- Smoother 120 Hz UI and playback on M1 Pro and M1 Max MacBook Pros.
- Native HDR viewers on M1 Pro and M1 Max MacBook Pros.
- Native full screen mode on Mac OS.

Dropbox
- Dropbox login within DaVinci Resolve preferences.
- Render presets for Dropbox and Dropbox Replay with background uploads.
- Sync comments and annotations with Dropbox Replay in Studio.
- Sync markers and comments with Dropbox in Studio.

Edit
- Subtitle and caption backgrounds now auto-resize to fit text content.
- Subtitle tracks in nested timelines now decompose to the main timeline.
- Adding a new subtitle caption now auto-focuses on the text area.
- Simple titles and subtitles are faster on Apple Silicon systems.
- Improved ease in and out functionality for position curves in the timeline.
- Options to include effects and grades for render in place operations.
- Switch multicam angles in the edit page with the Speed Editor.
- Ability to mark selection for timeline gaps.
- Edit asymmetric audio transitions created in the Fairlight page.
- Trim video and audio transitions asymmetrically using cmd/ctrl.
- Fine audio clip gain adjustments using shift + mouse drag.
- Support for pasting retime attributes on audio clips.
- Option to limit audio sync to the first timecode match.
- Preview composite modes by hovering over each mode in the inspector.
- Ability to set per-clip deinterlace quality in the inspector.
- New square iris transition.
- Support for custom aspect ratio controls for shape transitions.
- Improved overlays for Fusion tools in the viewer.
- Improved undo support for Fusion effects and Text+ in the inspector.
- Support for folder based organization of effect templates.
- New customizable key actions to go to previous/next timeline tabs.
- Ability to close timeline tabs with middle click.
- Preview generators and titles from the effects panel in the cut viewer.

Color
- Support for an automatic mode for color managed projects.
- Support for ACES 1.3, gamut compression and new CSC transforms.
- New 3D keyer with new modes, better selection/stroke logic and live feedback.
- Improved HSL and luma keyers with updated matte finesse controls.
- Track forward and back with a single action in trackers and magic masks.
- Node tooltips now indicate the LUT and effect type present.
- Dragging new links to layer and key mixers auto-creates node inputs.
- Dragging color nodes over key links creates key-to-RGB connections.
- Added individual primary and secondary tool icons for faster switching.
- Clip filters for timeline clips with Dolby Vision analysis or trim.
- Disabled clips are now shown as gray in the timeline.
- Support for applying camera LUTs and CDLs to ARRI MXF ProRes clips.
- The printer light state is now persisted across application restarts.
- Navigating to markers in the timeline now auto-scrolls to center the marker.
Added support for multi-GPU rendering (mGPU). SLI is required. To enable it, pass -MaxGPUCount=2 on the Unreal Editor command line and set r.PathTracing.MultiGPU=1, which can be toggled at runtime. (r.AllowMultiGPUInEditor=1 is no longer required in .ini files to enable mGPU in the editor.)
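As a sketch, the pieces above fit together like this (the project path is hypothetical, and the console variable can also be toggled from the editor console at runtime):

```shell
# Hypothetical launch command: allow the editor to use two GPUs
UnrealEditor.exe "MyProject.uproject" -MaxGPUCount=2

# Then enable multi-GPU path tracing, either at runtime from the editor
# console or via a ConsoleVariables.ini entry:
#   r.PathTracing.MultiGPU=1
```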
On-Demand Shader Compilation (ODSC) compiles only the shaders needed to render what is seen on screen in the editor, and during iterative platform development using cook-on-the-fly. ODSC can significantly reduce the number of shaders to be compiled for those who routinely sync their builds and have a large number of shaders to compile, for those who iterate on materials and shaders often, and for anyone without access to a remote DDC for cached shaders.
While ODSC can reduce shader compilation for most users working on large projects that regularly compile large numbers of shaders, it does not currently reduce the number of "global shaders" compiled during the initial startup of the editor.
The Light Mixer is a new editor window that displays all lights in the scene in a compact tabular format for rapid inspection and editing, much as the Environment Light Mixer presents the scene's environment lighting components.
Data Layers are a system for conditionally loading and unloading your world data by toggling layers in the editor and at runtime. Data Layers are an excellent way to organize your world in the editor, handle different scenarios in your game, and create variations of the same world.
You can now edit Simple Curves and Rich Curves in the Curve Data Table Editor. When creating a Curve Table, you can open the editor to edit curves in the table or curve view without needing to go back to the external program you created them with.
When working with Assets, users need only a minimal amount of information about them to display them in the editor. For example, a Texture may need a thumbnail image and editor properties (such as coordinates and scale); however, the bulk of the .uasset file for a texture is pixel data, which may not be needed and would be wasteful for your team to sync to the project.
Unreal Engine 5.1 features a new Import Framework designed to provide users with a high-performance, customizable asset pipeline. The framework works in the editor and at runtime, and provides scripting support via Blueprint and Python.
Introduced as Experimental in 5.0, the UV Editor has moved to Beta in 5.1. Previously, the UV Editor was accessible through a plugin. Now, you can quickly access the editor in three ways:
Packaged as a plugin, the ML Deformer Framework uses the existing Neural Network Inference (NNI) framework along with a reusable asset type and editor that you can use to train, inspect, and debug ML Deformer models, which deform meshes by evaluating neural networks at runtime.
In addition to the introduction of the ML Deformer framework and the Neural Morph Model, the ML Deformer asset editor has received several performance and quality-of-life improvements.
Previously, DMX support was enabled by means of the DMX plugin, and send/receive functionality was only enabled in Play, Preview, Game, or Packaged modes. We have added support for staying live while in the editor, enabling users to drive in-editor elements using DMX.
In 5.1, we introduced support for decoding Pixel Streaming video streams in-engine. This makes it possible to stream between multiple applications, and potentially between editors. The feature is accompanied by new Blueprint nodes for setting up stream playback in the engine without writing any C++ code. Currently, you can play back streams using a special material.
In the past, Unreal Engine users could stream packaged Unreal Engine applications and games. With the increase in remote working and remote collaboration, we have introduced experimental support for streaming the entire Unreal Editor itself via Pixel Streaming. Editor streaming can be tested using the new Pixel Streaming toolbar, visible when Pixel Streaming is enabled; alternatively, for an unattended setup, you can launch with -EditorPixelStreamingStartOnLaunch. Editor streaming works the same way as normal Pixel Streaming, and the signaling server is still required.
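For an unattended setup, the launch described above might look like this (the project path is hypothetical, and a signaling server must already be running, as with normal Pixel Streaming):

```shell
# Hypothetical unattended launch that starts editor streaming immediately
UnrealEditor.exe "MyProject.uproject" -EditorPixelStreamingStartOnLaunch
```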
There are some limitations to the editor streaming experience; for example, multi-monitor streaming is not yet supported. This feature is an experimental preview, and we intend to expand the feature set and user experience in subsequent releases. However, users should exercise caution when building against it, as it will likely change.
Previous versions of Unreal Engine supported building projects for Apple's ARM64 architecture, but Unreal Editor itself was not natively built for it and depended on the Rosetta translation layer when running on Apple Silicon devices. UE 5.1 rolls out an experimental version of native Apple Silicon support in Unreal Editor, meaning that M1 devices and later should see improved performance when running the editor.