LED video processing products manufacturer Brompton Technology has outlined features coming with Tessera v3.3 software, designed to optimise the performance of LED panels. New features include Operating Modes and Per-Batch PureTone. Operating Modes enables a single panel type to have multiple Operating Modes, each with a slightly different configuration for the panel. The new Tessera R2+ receiver card has the same SO-DIMM form factor as the R2; it is backwards-compatible with all existing R2-based panel designs, but brings additional data pins and capability to support new LED drivers, ensuring continued support for more panel types. In collaboration with Mo-Sys, Brompton has added support for the company's StarTracker camera tracking system when using panels fitted with the Tessera R2 or Brompton's new Tessera R2+ and an SX40 or S8 processor with Frame Remapping. Additionally, the Tessera SX40 and S8 now have a preset to support the ACES AP1 colour space, as used by ACEScg.

The ceiling looks more traditional: 4 of 3840x2160 panels.

If you have camera tracking, I'd let Unreal manage the final display: set up some interesting geometry and render the VJing onto that. If there's no camera tracking, then probably just concentrate on doing the wrap, and treat the ceiling as more eye candy.

I'd use Resolume to send out 12K x 5K as NDI, with 12K x 2.5K as the wall wrap. That will be about 7.5 Gbps after NDI, so 10 Gbps networking is a must. If you can stretch to 25 Gbps or 40 Gbps, then doubling the resolution might be an idea, just so the final render is using high-quality textures. Although I'd probably go with 25 Gbps, because I don't know if QSFP is actually 40 Gbps for a single stream or 4x10 Gbps (which would be fine for multiple streams, but not for one big chungus of a stream).

Every render server receives the entire canvas and displays just its slice of it. I'd also do it as multicast NDI (if that's possible), as the Resolume machine may get pegged just rendering that canvas, never mind sending it to however many receivers.
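To put rough numbers on the canvas-and-slices idea, here's a minimal Python sketch. The frame rate, the RGBA pixel format, the slice layout, and especially the NDI compression ratio are all illustrative assumptions, not measured figures:

```python
# Back-of-envelope maths for pushing one big canvas over NDI and letting each
# render server crop out its own slice. All constants here are assumptions.

CANVAS = (12288, 5120)   # 12K x 5K master canvas out of Resolume
FPS = 60
NDI_RATIO = 8            # hypothetical compression ratio -- measure on real gear

def gbps(width: int, height: int, fps: int = FPS, bpp: int = 4) -> float:
    """Raw bandwidth of an uncompressed 8-bit RGBA stream, in gigabits/sec."""
    return width * height * bpp * 8 * fps / 1e9

raw = gbps(*CANVAS)
print(f"raw uncompressed: {raw:.1f} Gbps")               # ~120.8 Gbps
print(f"after NDI (est.): {raw / NDI_RATIO:.1f} Gbps")   # ~15.1 Gbps at 60 fps

# Each render server receives the whole canvas but only displays its crop.
# (x, y, w, h) rectangles -- a hypothetical layout, adjust to the real rig.
slices = {
    "wall wrap": (0,    0, 12288, 2560),
    "ceiling":   (0, 2560, 12288, 2560),
}
for name, (x, y, w, h) in slices.items():
    print(f"{name}: crop {w}x{h} at ({x},{y})")
```

At 30 fps the same maths lands right around the ~7.5 Gbps quoted above, which is why 10 Gbps networking is the floor and 25 Gbps buys comfortable headroom.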
But honestly, chances are NDI is not going to work well for this. I think TouchDesigner has started to roll out frame-sync metadata for NDI, so maybe use TD on the render machines for frame sync, and use Spout/Syphon to get Resolume into TD to convert to NDI on the VJ machine. If TD can do NDI metadata frame sync, then I'm sure there's a solution for Unreal that you can figure out. It's probably going to be more of a DIY thing, though.

You could do something interesting like encoding a timecode into some of the pixels (sketched at the end of this post). How good is your C#? I'm not sure how nitty-gritty Unreal lets you get with the pixels and data of textures. I presume you are using Quadros and sync cards on the render machines, right? Otherwise you are going to have headaches with tearing at the joins, outside of software frame sync.

As for actually VJing it: I'd probably concentrate on the wall, use the ceiling as eye candy, and have some more custom-built things that wrap onto the ceiling in a pseudo-forced-perspective way. Or just barf content onto all of it and rely on the unique screen to be the interesting thing (although having the tech be the interesting part is pretty lazy). Or do camera tracking, some funky geo in Unreal, actual forced perspective, and use Resolume to paint the geometry. Or do funky geo and paint it with Resolume, but also have some geometry animations and such. And actually, probably all of the above if you have enough time. Some sets show off tech, some show off creativity, some are immersive as hell, and some will just bop to the right tunes.
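On the timecode-in-pixels idea, here's a minimal Python/NumPy sketch of one way it could work. The whole scheme -- the 24-bit frame counter, the 8x8 blocks, the corner placement, and the function names -- is made up for illustration; it's not an established protocol or anyone's shipping API:

```python
import numpy as np

BITS = 24    # frame counter wraps every 2**24 frames (~77 hours at 60 fps)
BLOCK = 8    # one 8x8 pixel block per bit, so it survives mild scaling/blur

def encode_frame_id(frame: np.ndarray, frame_id: int) -> np.ndarray:
    """Stamp frame_id into the top-left corner as black/white blocks."""
    out = frame.copy()
    for bit in range(BITS):
        value = 255 if (frame_id >> bit) & 1 else 0
        x = bit * BLOCK
        out[0:BLOCK, x:x + BLOCK] = value   # writes across all channels
    return out

def decode_frame_id(frame: np.ndarray) -> int:
    """Read the counter back by thresholding the mean of each block."""
    frame_id = 0
    for bit in range(BITS):
        x = bit * BLOCK
        if frame[0:BLOCK, x:x + BLOCK].mean() > 127:
            frame_id |= 1 << bit
    return frame_id

# Round-trip check on a fake 1080p RGB frame:
frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
assert decode_frame_id(encode_frame_id(frame, 123456)) == 123456
```

The idea would be for each render machine to decode the counter off its incoming frame and only flip buffers once its neighbours report the same ID, with the stamped strip sitting in a masked-off corner of the canvas so it never actually hits the LEDs.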