Small Pixel mapping

David Krom Registered User
Is there pixel mapping in the next update, or what are the plans?
It would be useful even if you only have 20 sunstrips or LED fixtures.
The big video stuff you do with a media server, but for those little things it would be very handy. :)

Comments

  • Michael_Graham Registered User, Super Moderator, HES Staff
    edited January 2014
    Hello David,

    Pixel mapping is something we have talked about, but I do not know when we will do it, sorry.
  • Sebsworld Registered User, Hog Beta
    edited February 2014
    I think this type of feature would have been useful on all the shows I have done in the last 12 months.
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Just a question:
    How large are the arrays we are talking about here?
    Are you talking about some bitmaps that get animated? Running movies?

    When I need to do pixel-mapping I always prefer a media server, because most of the time 8 or more universes are used.
  • stagelites Registered User, Hog Beta
    edited February 2014
    I think that pixel mapping is necessary. On large scale projects a media server is great. However on the MA I can map a graphic to PARS, strobes, movers. Color and intensity is mappable. This makes for some very easy and dramatic effects.

    I am a total hog fan. I think that not having this in their software on a desk of this calibre is a real mistake.

    Much as I hate to admit it, my friends running their MA's can smoke me on certain jobs, simply because that feature exists.

    Being able to assign mapping to anything makes some jobs ridiculously easy. I think that this is something that needs to be addressed and added. Not having this feature when other desks costing less have a wonderful implementation of it is a serious oversight.
  • MLorenz Registered User, Hog Beta
    edited February 2014
    I agree that it is missing.
    But what I really wanted to know is what you would like to do with it.
    Run movies? Play back a bitmap and animate it?
    Array size? 10x10 pixels? 10x20 pixels? 100x100 pixels?
    In which scenarios are you using console-internal pixel mapping?
  • stagelites Registered User, Hog Beta
    edited February 2014
    If I need to run movies or full video then at that point, I would use media servers. However I think that the ability to map a small movie (i.e. a flag waving) onto a set of LED fixtures would be great.

    A Bitmap would be ideal. 100x100 or 320x320. Or 16x16. I think that any would be better than what we have currently.

    I would use bitmaps for intensity chases, color chases, simple LED walls. Color effects and strobe effects.

    I love the Hog. I have spent a lot of my hard earned money using them and buying them. I also have avoided the MA a lot, simply because I liked my hog. However, I cannot ignore that the MA and Chamsys can do some things that the Hog cannot.

    I am seeing more and more designers request the MA. I think it is simply because it has some key features that the hog console is missing. When I talk to ops and other designers, they all love the hog, they simply lament the missing features. A number of them have told me that they moved over because of the lack of these features and support.

    I just replaced my Hog 3 with a Fullboar 4. I debated on buying a full Hog 4. It was a tough decision, I loved the desk. I decided not to because I felt that software wise the Hog 4 was not a great deal. For what it cost it was lacking a number of key features that another desk has for the same price.

    Pixel mapping is a necessity in today's lighting world. It makes LED programming simpler, and it allows complex effects.

    I kind of look at this as a Mac versus PC argument. (I am a mac guy BTW) I hear my friends who love Hog tout it. My MA friends tout the MA. While I think that the Hog has a much friendlier interface, I cannot ignore some of the wonderful features that the MA has.

    I will use whatever system will do the job best. Sadly I have had to pick the MA on a number of jobs because of what it does and the hog does not.
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Ok...
    100x100 pixel RGB for example....
    That's 30,000 DMX channels... Guys, keep it serious... Have you ever done this?
    59 universes....
    How high does the resolution need to be, or how far away does the viewer need to be, for a waving flag to read as what it is meant to be...
    Just thinking of the 50 stars of the flag of the USA...
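    Just to show the working behind those numbers, a quick sketch (assuming plain 8-bit RGB pixels and 512 channels per universe; the array sizes are only examples):

```python
import math

# Quick check of the channel counts above, assuming plain 8-bit RGB pixels
# and 512 channels per DMX universe.
def dmx_footprint(width, height, channels_per_pixel=3):
    channels = width * height * channels_per_pixel
    universes = math.ceil(channels / 512)
    return channels, universes

for w, h in [(10, 10), (10, 20), (100, 100)]:
    channels, universes = dmx_footprint(w, h)
    print(f"{w}x{h} RGB: {channels} channels -> {universes} universes")
# 100x100 RGB: 30000 channels -> 59 universes
```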

    So that's why I was asking: what are you really using? What do you need pixel-mapping for in a console?
    I agree with small bitmaps like letters and numbers, stars etc. to use with Jarags etc.
    But for the rest... See my example above... This means a lot of processing and also external hardware; a 100x100 pixel matrix would need 4 DP8Ks.
    For the same money I can get a mediaserver that can do more.
  • stagelites Registered User, Hog Beta
    edited February 2014
    I am aware how many DMX channels that would be; I was not proposing that in my post. What I mean is the ability to use images of that size. How you map the matrix would be smaller.
    I agree about the media server. For some show needs a built-in mapping ability would be very useful.

    I have seen mapping done in a 4 universe show that was quite nice.
  • Buzz313th Registered User
    edited February 2014
    I would rather have HES spend their time and effort on developing a richer set of control features than adding a Bell and Whistle just because other consoles have it.

    Just my humble opinion...

    JB
  • stagelites Registered User, Hog Beta
    edited February 2014
    I agree on a richer control set. I feel that this is part of a control set. I have seen how useful it is, and even at a minimal level it would be much used.
  • Marty Postma Registered User
    edited February 2014
    I think it is a much needed feature as well.
    As far as I understand it, the new H4 desks have so much overhead in their processor that the desks could easily handle this.

    There are some distinct advantages to pixel-mapping directly off the desk.
    For instance if I want to play back media across an array of fixtures and then suddenly have all or even just some of them go to a solid color....it requires additional layers on the media server, potentially with custom masks, transparencies and/or alpha channels.
    With direct control off the desk I could have all the fixtures "playing back" a video clip and then use the outside fixtures in a solid color to frame it for example all very quickly and easily without having to resort to tricks off a media server or a DMX merge between a server and a desk.
    It makes it much cleaner and quicker to just do it all off the desk.
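    As a rough sketch of the kind of per-fixture override I mean (the grid size and colours here are invented placeholders, not anything from the desk software):

```python
# Take one frame already mapped to the fixture grid and force the outer ring
# of fixtures to a solid colour before it goes out. Grid size and colours are
# invented for illustration.
WIDTH, HEIGHT = 10, 6            # fixture grid, not video resolution
FRAME_COLOUR = (255, 40, 0)      # solid colour for the framing fixtures

def with_border(frame, colour):
    """frame is a HEIGHT x WIDTH grid (list of rows) of (r, g, b) tuples."""
    out = [row[:] for row in frame]
    for x in range(WIDTH):
        out[0][x] = colour
        out[HEIGHT - 1][x] = colour
    for y in range(HEIGHT):
        out[y][0] = colour
        out[y][WIDTH - 1] = colour
    return out

video_frame = [[(128, 128, 128)] * WIDTH for _ in range(HEIGHT)]   # dummy clip frame
framed = with_border(video_frame, FRAME_COLOUR)
print(framed[0][0], framed[1][1])   # (255, 40, 0) (128, 128, 128)
```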

    I know it is a feature that has been discussed and is planned, just no idea when they will actually be able to implement it.

    Hope this helps. :)
  • Buzz313th Registered User
    edited February 2014
    As per the specs on the HES website, the H4 is on an Intel i5, the FB on an Intel i3; both, I believe, are dual core, not quads. I have two Arkaos MMP servers that are running overclocked dedicated i7 quads with matched boards, 16 GB of fast RAM and a top-of-the-line AMD FirePro card, and I have no problem choking those machines if I try, especially when I'm sending to a pixel map instead of just a video surface. Sure, the H4 will never be a replacement for a media server, but I don't think you realize the amount of processing at the CPU that's required to simply decode a compressed video file and then convert it to DMX data for a mapped surface. Sure, the H4 could do it, but either way you look at it, you're going to be robbing Peter to pay Paul, and the control response from the desk is going to suffer. It may be unnoticeable, or completely noticeable. In the end, unless you're running a very low resolution pixel-mapped effect, you're going to end up using a media server anyway.

    Doing an LTP, HTP or switchable Artnet or DMX merger at a node, IMHO, is actually as easy and intuitive as it would be if you were running the media and a fixture patch out of the console.

    Pixelmapping from the desk is indeed a cool feature, but if it gets in the way of the control desk doing its primary job, then it's more of a hindrance.

    Another thing to consider... The use of LEDs as an image display device is becoming more popular and more affordable. The displays are getting bigger and of smaller pitch, which basically means more pixels overall. The big LED rigs of yesterday will be the small rigs of tomorrow.

    With that being said, if HES does implement a video driven pixel mapper on the H4 without sacrifice, then better still.

    JB
  • Marty Postma Registered User
    edited February 2014
    Buzz313th wrote: »
    As per the specs on the HES website, the H4 is on an Intel i5, the FB on an Intel i3; both, I believe, are dual core, not quads.

    Where do you see this info? I could not find it. As far as I know the H4, FB4, & RH4 all use the same AMD processor. Slightly different motherboards between the models, but the same processor on all of them. Or maybe I am thinking of the onboard DP-8000 processor that is in all of them.
    Buzz313th wrote: »
    I have two Arkaos MMP servers that are running overclocked dedicated i7 quads with matched boards, 16 GB of fast RAM and a top-of-the-line AMD FirePro card, and I have no problem choking those machines if I try, especially when I'm sending to a pixel map instead of just a video surface. Sure, the H4 will never be a replacement for a media server, but I don't think you realize the amount of processing at the CPU that's required to simply decode a compressed video file and then convert it to DMX data for a mapped surface. Sure, the H4 could do it, but either way you look at it, you're going to be robbing Peter to pay Paul, and the control response from the desk is going to suffer. It may be unnoticeable, or completely noticeable. In the end, unless you're running a very low resolution pixel-mapped effect, you're going to end up using a media server anyway.

    Arkaos isn't really any kind of an accurate benchmark for anything Hog related.....apples to oranges....if you want to go that way I have run Ivy Bridge i7 AXON servers with 9 layers of full HD (1920x1080p) and all 32 FX running crossfades as well as visual FX running crossfades, etc.....the CPU never got above 30% or so...GFX and HDD were all performing well this way too....in short the servers were not even breathing hard....a Windows or Mac computer is burdened with all kinds of things a dedicated OS system like Hog or AXON doesn't have to worry about. Will it take system resources to run video files? Of course it will, but with the current hardware being what it is I highly doubt we would see any noticeable lag in performance.

    I do agree that any type of truly "heavy duty" pixel mapping should be done via media server or dedicated product interface/processor.
    Buzz313th wrote: »
    Doing an LTP, HTP or switchable Artnet or DMX merger at a node, IMHO, is actually as easy and intuitive as it would be if you were running the media and a fixture patch out of the console.

    I disagree with you there too. More pieces of gear in the system mean more potential points of failure. It would be far more elegant and streamlined to have all of this contained within the desk via software.
    Buzz313th wrote: »
    Pixelmapping from the desk is indeed a cool feature, but if it gets in the way of the control desk doing its primary job, then it's more of a hindrance.

    Agreed, but as I stated above I highly doubt this will be an issue given the current hardware spec on the desks.

    Hope this helps. :)
  • Buzz313th Registered User
    edited February 2014
    Hey Marty, Processor info on 1st page under Hardware.

    H4.. http://www.highend.com/pdfs/HOG_4_Console_ArchSpec.pdf

    FB4.. http://www.highend.com/pdfs/Full_Boar_4_Console_ArchSpec.pdf

    RH4.. http://www.highend.com/pdfs/Road_Hog_4_Console_ArchSpec.pdf


    As for the onboard DP8000... Correct me if I am wrong, but I do believe they are sharing the one and only processor on the desk. As far as I know, or assume, the onboard DPs don't have their own CPU like the external DPs do. But I really don't know, as I can't find any confirmation documented anywhere.


    In regards to the Arkaos Vs Axon..
    Take your Axon and try scratching the H264 content. Anything not playing back forwards and you're going to peg your processor. Feed it 32-bit Apple ProRes files and try running just 4 1080p layers. You will probably see skipping and dropped frames. Most media servers require a certain list of file types and encoding or media compression. This was done so the manufacturer can control the types of files that you are allowed to play back based on what they designed their system to do. This means that sometimes you have to re-render content to make it compatible. Arkaos is one of the exceptions, as it will take almost any file type and compression. It's up to you as the programmer to choose the best file type for the type of playback you intend on using.

    I can run 12 layers, with piled-on realtime rendering effects, on 6 outputs of 1080p forward, compressed with H264, no problem. But as soon as I try to run one H264 layer on one output backwards, or scratch the content, I choke the system and it pegs the processor at 100% load. Or if I take 3 simple files 30 seconds long encoded with Apple ProRes at 32-bit color and try to run just 4 layers, regardless of output count, it chokes the system. It's not just layer count, output count, or what realtime rendering you're doing, since that's all being processed at the GPU... it has mostly to do with how much decoding the CPU has to do and how you intend on playing it back. It's the decoding of a compressed file that will hammer the CPU. My Arkaos boxes are dedicated for just media work, the BIOS has been tweaked, the OS has been tweaked and the only processes running are the processes needed for the dedicated media work. It's as good if not better than your Axon or the Arkaos dedicated stadium server. Don't kid yourself, most if not all the media servers are running a stripped-down version of Win or Mac OS, WITH ONLY THE NECESSARY PROCESSES RUNNING. The media server manufacturers offer the servers up as turnkey solutions for someone who doesn't want to do their own system configuration. It's still nothing more than a Windows or Mac box running software. But more importantly, it's a separate system, designed specifically for manipulating and outputting media in realtime. Personally, I would like to see the Hog play to its strong points and not get labeled as a "Jack of all trades and a winner of none".

    It's a good forum here and there's no reason to agree if you disagree, so I respect your opinions. Sure, more gear means more points of failure... but I don't feel comfortable adding extra load and tasks to the most important piece of hardware in the system, which is the control desk. Especially when I can use a dedicated media server.

    Good conversation... All very valid points.

    JB
  • Ben_Taylor Registered User
    edited February 2014
    I see the point above, but if a Chamsys can handle basic pixel mapping and we're saying a Hog can't, we may as well go home now.
    Not to mention Avo, MA and several others......
    It's a basic tool that a lot of people expect; if we don't have any form of it then it takes us out of the running. Just my two cents.
  • Buzz313th Registered User
    edited February 2014
    I'm not saying a Hog can't handle it. I guess what I'm saying is that, IMHO, pixel mapping and running media from a console will always have their limitations when compared to a dedicated media server. So why bother doing it half-assed? Just get the correct tool for the job.

    You bid a job with just a desk, telling the client you can send the LEDs a media file. Then the client sees what you can do with a small array of pixels and asks for more. If you're at the limit of your bitmap FX from the desk, then what?
  • Ben_Taylor Registered User
    edited February 2014
    If you can afford a media server that's great, but not every job can. If all you're controlling is 10 LED battens hung from a rig, grouped into a 10x10 pixel square, and you want to run a small effect across them, it's a pain, whereas with even Avo (and I hate saying that) it's really easy to get something running on it (and I'm talking about a star or something that the effect engine can't do, without getting into media server world).
    Do we know what the limits of pixel mapping are on things like Avo and Chamsys?
    If Hog could do everything Chamsys does and more (like we are pushing for) then every Chamsys user would naturally progress to the better feature set of Hog.
  • MLorenz Registered User, Hog Beta
    edited February 2014
    I think the main problem here is what each of us means when we say "pixel mapping".
    That's why I was asking in which situations and with which kind of arrays we are talking about.
    A 10x10 array of LED battens is something small.
    But the problem is what JB mentioned: the limits.
    I don't know how Avo or MA or Chamsys handles this. From GMA1 I know it had only small bitmaps. LSC Clarity can play movies...
    I think what most people want to see is an easy bitmap animator, not an option to play back full HD videos. Together with that, a more advanced selection tool like MATricks.
  • Ben_Taylor Registered User
    edited February 2014
  • Marty Postma Registered User
    edited February 2014
    Buzz313th wrote: »

    Cool....didn't even think to look there,
    Buzz313th wrote: »
    Take your Axon and try scratching the H264 content. Anything not playing back forwards and you're going to peg your processor. Feed it 32-bit Apple ProRes files and try running just 4 1080p layers. You will probably see skipping and dropped frames.

    Yes scrubbing through frames does make it work a bit harder, but not excessively....probably b/c AXON uses only VERY SPECIFICALLY encoded MPEG-2 content files.....H264 and Apple Pro Res, etc won't work with AXON.
    Buzz313th wrote: »
    Most media servers require a certain list of file types and encoding or media compression. This was done so the manufacturer can control the types of files that you are allowed to play back based on what they designed their system to do. This means that sometimes you have to re-render content to make it compatible. Arkaos is one of the exceptions, as it will take almost any file type and compression. It's up to you as the programmer to choose the best file type for the type of playback you intend on using.

    Much like Hog....I'm sure once pixel mapping is enabled, then the video file type will need to be very closely controlled in order to keep system resources from being overly taxed. I would hope/expect some type of import file screening and conversion to be available directly on the desk to help facilitate that. Perhaps there should also be a "meter" of some sort that shows us exactly how hard the system is working so we can make informed decisions as we program as to how far we are willing to push it.
    Buzz313th wrote: »
    My Arkaos boxes are dedicated for just media work, the BIOS has been tweaked, the OS has been tweaked and the only processes running are the processes needed for the dedicated media work. It's as good if not better than your Axon or the Arkaos dedicated stadium server. Don't kid yourself, most if not all the media servers are running a stripped-down version of Win or Mac OS, WITH ONLY THE NECESSARY PROCESSES RUNNING. The media server manufacturers offer the servers up as turnkey solutions for someone who doesn't want to do their own system configuration. It's still nothing more than a Windows or Mac box running software.

    That is all well and good, but unless you are developing your own version of, say, XPe or Win7 Embedded....the "standard" OS can only be ratcheted back so far....not nearly as far as a custom embedded OS version.
    Buzz313th wrote: »
    .....a separate system, designed specifically for manipulating and outputting media in realtime. Personally, I would like to see the Hog play to its strong points and not get labeled as a "Jack of all trades and a winner of none".

    And for "heavy duty" or large scale use I would agree a server is a much better option. I also agree that when it is done it has to be done right, and not just as a gimmick or marketing ploy.
    Buzz313th wrote: »
    It's a good forum here and there's no reason to agree if you disagree, so I respect your opinions. Sure, more gear means more points of failure... but I don't feel comfortable adding extra load and tasks to the most important piece of hardware in the system, which is the control desk. Especially when I can use a dedicated media server.

    Good conversation... All very valid points.

    I agree. I enjoy these discussions with other intelligent people here regardless of their point of view....one of the many reasons you won't find me on some of the other forums where people just want to take a piss on one another all day long....I've got no time for that.
  • Marty Postma Registered User
    edited February 2014
    MLorenz wrote: »
    I think what most people want to see is an easy bitmap animator, not an option to play back full HD videos. Together with that, a more advanced selection tool like MATricks.

    I wouldn't say a bitmap animator takes it far enough....just a still image that then moves around is OK, but not nearly dynamic enough. A low-res video file would offer much better options, or some sort of onboard file conversion that can take a higher res file and sample pixel areas to take it down to a lower res.
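    Something along these lines, roughly (only a sketch; the greyscale frame here is invented for illustration):

```python
# Average rectangular blocks of a higher-res frame down to one value per fixture.
def downsample(frame, out_w, out_h):
    """frame: list of rows of pixel values; returns an out_h x out_w grid of averages."""
    src_h, src_w = len(frame), len(frame[0])
    block_h, block_w = src_h // out_h, src_w // out_w
    result = []
    for by in range(out_h):
        row = []
        for bx in range(out_w):
            block = [frame[y][x]
                     for y in range(by * block_h, (by + 1) * block_h)
                     for x in range(bx * block_w, (bx + 1) * block_w)]
            row.append(sum(block) // len(block))
        result.append(row)
    return result

# An 8x4 source frame sampled down to a 4x2 fixture grid.
source = [[(x + y) * 16 for x in range(8)] for y in range(4)]
print(downsample(source, out_w=4, out_h=2))
```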
  • Buzz313th Registered User
    edited February 2014
    Cool post Marty, Thanks.

    If I need to scrub files or go backwards on Arkaos, I found that "PhotoJpg" files re-rendered with a keyframe every frame, or the "Anim" file type, will give me the ability to play back in any direction or scrub (scratch) with very little CPU overhead. I have been able to scrub 8 layers off faders and have CPU and GPU FPS stay well above 300. Those file types allow for an almost uncompressed file. It's a huge file compared to other types, but more importantly it doesn't need a lot of decoding, so it leaves your CPU alone. The bottleneck in this case is the RAM and then the SSD, both of which are fat enough for the task.

    If your Axon will eat those file types, give it a spin.

    All the best

    JB
  • Buzz313th Registered User
    edited February 2014
    In regards to the OS... I'm on Win 8.1 now for the servers, and not only has it improved rendering performance over Win 7, but Win 8.1 is extremely customizable. I believe I trimmed it down to just the bare bones necessary for Arkaos.

    JB
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Marty Postma wrote: »
    I wouldn't say a bitmap animator takes it far enough....just a still image that then moves around is OK, but not nearly dynamic enough. A low-res video file would offer much better options, or some sort of onboard file conversion that can take a higher res file and sample pixel areas to take it down to a lower res.

    Hey Marty,

    yes and no...
    I agree that it would offer many more options.
    I would love to see something like in Clarity.
    But no, because of all the reasons you also mentioned: which files, which codec, processor load etc etc.
    JB also mentioned one thing: when a problem occurs the whole system might be affected...
    And when I see the answers to my question about which array sizes people are using, an animated BMP would be good for 90% or more of cases, even though the option to play real video would be better.
    I think one of the main differences to consoles like Clarity and Chamsys is that the HOG OS distributes the calculation to specific nodes and doesn't do all the DMX calculation in the console. Once a cue is programmed it is "stored" in the DP, and the console only starts this cue and lets all calculations be done by the DP.
    So that would mean all movies etc. must first be transferred to the DP and also stored there before they could be played back. This would mean a Gbit network as a must etc... Also something that needs to be considered.
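    To get a rough feel for the numbers behind the Gbit point (the frame rates and the assumption of uncompressed 8-bit RGB are mine, just for scale):

```python
# Rough bandwidth estimates, assuming uncompressed 8-bit RGB and no protocol overhead.
def mbit_per_second(width, height, fps, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps * 8 / 1e6

# Streaming raw full-HD frames across the network would not fit comfortably on Gbit:
print(round(mbit_per_second(1920, 1080, 30)))    # ~1493 Mbit/s

# The mapped result itself (say a 100x100 array refreshed at DMX-like rates) is tiny:
print(round(mbit_per_second(100, 100, 40), 1))   # ~9.6 Mbit/s
```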
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Buzz313th wrote: »
    Cool post Marty, Thanks.

    If I need to scrub files or go backwards on Arkaos, I found that "PhotoJpg" files re-rendered with a keyframe every frame, or the "Anim" file type, will give me the ability to play back in any direction or scrub (scratch) with very little CPU overhead. I have been able to scrub 8 layers off faders and have CPU and GPU FPS stay well above 300. Those file types allow for an almost uncompressed file. It's a huge file compared to other types, but more importantly it doesn't need a lot of decoding, so it leaves your CPU alone. The bottleneck in this case is the RAM and then the SSD, both of which are fat enough for the task.

    If your Axon will eat those file types, give it a spin.

    All the best

    JB

    These file types are always best if you want to play back a movie backwards. A keyframe every frame is the key to success ;-)
    I use Catalyst a lot and the CPU is the secondary thing there, like in most media servers. The GPU and fast data transfer from your storage are the main things to take care of. I can play back 8 layers of full HD ProRes 422 on a Mac Pro quad core 5.1 with a Quadro 4000 in it.
  • Buzz313th Registered User
    edited February 2014
    Marc,

    So with the Quadro 4000, you can send 4 1080p outs, right?

    JB
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Hey JB,

    As the Quadro 4000 only has 2 outputs, without any additional hardware it is just 2.
    But with a Datapath x4 connected, or 2 dual/triple heads, it is possible.

    Even better are the GTX 680 or HD 7950. They both have 4 outputs, and Catalyst will soon support 4 direct outputs.
    With these cards 4K output is also possible.
    I will do some tests with the new Mac Pro as soon as there is a Mavericks version of Catalyst.
  • Buzz313th Registered User
    edited February 2014
    I ran the 7970 for a while, but just upgraded both boxes to AMD W9000s. They are monster cards: 6 outs at 4K each, plus I can genlock (framelock) at the cards, which is awesome since I don't have to use an ImagePro. They are well worth the money and transform each box into a good investment.
  • digne Registered User
    edited February 2014
    hello,
    I'm quite divided on pixel mapping directly in the console. This would require a lot of crossfading on all DMX channels and the console performance would suffer. So I think the media server solution is better.
    But imagine pixel mapping on a 10x10 MagicPanel wall (60x60 pixels).
    If I want to run media across the entire map then the media server works well.
    But now imagine that we only want to turn on the first and last LED of each MagicPanel... I think it becomes a little complicated with a media server.
    The easiest, in this case, is for the console to turn on those LEDs directly. But unfortunately it is the media server that is wired to the MagicPanels....
    We should therefore merge the media server AND the console...
    So in the end we need 1 media server + 1 HTP merger.

    So I think it would be more interesting if the Hog could be a REAL HTP merger that can do pixel mapping on its own.
    This way we can run all kinds of media but also easily control each pixel.

    Regards,

    Gael
  • Buzz313th Registered User
    edited February 2014
    ^^

    +1

    JB
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Buzz313th wrote: »
    I ran the 7970 for a while, but just upgraded both boxes to AMD W9000s. They are monster cards: 6 outs at 4K each, plus I can genlock (framelock) at the cards, which is awesome since I don't have to use an ImagePro. They are well worth the money and transform each box into a good investment.

    With Apple and Catalyst I'm a bit limited in which cards can be used.
    But the cards in the new Mac Pro are quite comparable to the W9000 (unfortunately without Genlock).
    Looking forward to the new Axon Pro HD Live with Genlock... I think they might also use this graphics card.
  • MLorenz Registered User, Hog Beta
    edited February 2014
    digne wrote: »
    hello,
    I'm quite divided on pixel mapping directly in the console. This would require a lot of crossfading on all DMX channels and the console performance would suffer. So I think the media server solution is better.
    But imagine pixel mapping on a 10x10 MagicPanel wall (60x60 pixels).
    If I want to run media across the entire map then the media server works well.
    But now imagine that we only want to turn on the first and last LED of each MagicPanel... I think it becomes a little complicated with a media server.
    The easiest, in this case, is for the console to turn on those LEDs directly. But unfortunately it is the media server that is wired to the MagicPanels....
    We should therefore merge the media server AND the console...
    So in the end we need 1 media server + 1 HTP merger.

    So I think it would be more interesting if the Hog could be a REAL HTP merger that can do pixel mapping on its own.
    This way we can run all kinds of media but also easily control each pixel.

    Regards,

    Gael

    Very good point.
    Artnet merging should have more options, like LTP or HTP per channel.
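    Something like this per-channel behaviour is what I mean (a minimal sketch; the channel assignments are invented and the "console wins when set" rule is only a stand-in for real LTP):

```python
# Merge one universe from a media server with one from the console:
# HTP (highest wins) on the channels listed, console override on the rest.
def merge_universe(server, console, htp_channels):
    """server, console: 512-value lists (0-255); htp_channels: set of 0-based offsets."""
    merged = []
    for i in range(512):
        if i in htp_channels:
            merged.append(max(server[i], console[i]))          # HTP: highest takes precedence
        else:
            # Simplified stand-in for LTP: console wins whenever it has set the channel.
            merged.append(console[i] if console[i] else server[i])
    return merged

server_dmx = [100] * 512
console_dmx = [0] * 512
console_dmx[0] = 255                                  # console forces channel 1 to full
out = merge_universe(server_dmx, console_dmx, htp_channels={0})
print(out[0], out[1])                                 # 255 100
```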
  • Buzz313th Registered User
    edited February 2014
    I know HES was testing a few of the Firepro Cards, but not sure which one ended up in the Axon HD.

    JB
  • MLorenz Registered User, Hog Beta
    edited February 2014
    To come back to the topic...
    When will there be pixel mapping in an Axon? ;-)
  • shandy666 Registered User
    edited February 2014
    All said, we need some form of pixel mapping!! Chamsys ops in film use pixel mapping instead of effects to do chases!! At festivals you get a wall of Jarags, and a small pixel effect is a quick way of getting good visuals!! If Chamsys and Avo can do this over Artnet, surely the H4 (which has shitloads more power) is capable!! Having to change desks to Chamsys because the H4 can't is not the best way FORWARD!! Can we please move forward with a little more haste!
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Andy, sometimes more power and more options also result in a more difficult setup and realisation.
    Chamsys and Avo output Artnet just from the console, so there is only one physical output and one processor taking care of everything.
    HOG can output from the console and also via DPs, a distributed system where the DMX calculations are done. This is a more complex task than in the Chamsys or Avo world.
    I agree that it is not impossible and should be done. But it is more complex to implement. With that said, I have heard rumors that it is being worked on.
  • srautane Registered User, Hog Beta
    edited February 2014
    Personally, I am more interested in taking things to the next level.
    We should focus more on what we want from pixel mapping and what kind of pixel mapping would be something new. Whether there is going to be pixel mapping or not is kind of an irrelevant discussion.
    What we should think about is what kind of pixel mapping is best for our needs and what makes a good integration with the FX engine.

    IMO it is important.

    -To have a pixel mapping that is easy to use
    -Good crossfades between FX engine, and between bitmaps.
    -Here's a good place for multitouch moving, scaling and rotation.
    -Still image animation (for example move, scale, rotate, intensity)
    -Ability to pixel map almost any parameters (pan and tilt, iris, zoom, etc)
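    On the last point, a tiny sketch of what mapping a sampled pixel value onto an arbitrary parameter range could look like (the bitmap values and tilt range are just examples):

```python
# Scale an 8-bit pixel value into any parameter range, not just intensity.
def pixel_to_parameter(pixel_value, param_min, param_max):
    return param_min + (pixel_value / 255.0) * (param_max - param_min)

# One row of a greyscale bitmap driving a tilt offset of -30..+30 degrees
# across a row of four fixtures.
bitmap_row = [0, 85, 170, 255]
tilt_offsets = [pixel_to_parameter(v, -30.0, 30.0) for v in bitmap_row]
print(tilt_offsets)   # approximately [-30.0, -10.0, 10.0, 30.0]
```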
  • shandy666 Registered User
    edited February 2014
    Thank you Mark for your usual mass of information that you share on this site!! It always helps me understand the complexities more!!
  • MLorenz Registered User, Hog Beta
    edited February 2014
    srautane wrote: »
    Personally, I am more interested in taking things to the next level.
    ...

    IMO it is important.

    -To have a pixel mapping that is easy to use
    -Good crossfades between FX engine, and between bitmaps.
    -Here's a good place for multitouch moving, scaling and rotation.
    -Still image animation (for example move, scale, rotate, intensity)
    -Ability to pixel map almost any parameters (pan and tilt, iris, zoom, etc)

    Good point Sami, the option to map the value of a pixel to any parameter would be a great tool for creating very complex looks.
    All of your requests are what I would expect from a good pixel mapper, and that's also a reason why I'm using a media server for that task most of the time.
    I think it makes no sense just to implement a small BMP animator; if it is done it must be done correctly, and as a minimum with the same options Chamsys, Avo or Clarity offer. If you want to separate yourself from the competitors you need to take the stuff that the others have and make it better/easier to use, or you must provide more features.
  • rosswill Registered User, Hog Beta
    edited February 2014
    I have been reading with interest. Personally I think the pixel mapping thing is a reaction to the inability to do what we want to do on the console, and is a crude workaround, in most cases. What's missing for me on this and some other consoles currently is the ability to lay out your fixtures graphically yet proportionately, be there 10 or 10,000 of them, and be able to create the desired looks on stage quickly, accurately and in a manner that is both reusable and scalable. By this I mean you should be able to create a sequence on one show with 100 fixtures and re-use this on another show with 500 fixtures. You should be able to alter the relationship of the fixtures in the layout to create various offsets without reprogramming your data.
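    To sketch what I mean by proportional (the sine-wave effect and fixture counts are only examples):

```python
import math

# Give every fixture a normalised 0..1 position in the layout and define the
# effect against that position, so the same programming scales to any count.
def layout(n_fixtures):
    return [i / (n_fixtures - 1) for i in range(n_fixtures)]

def wave_intensity(position, offset=0.0):
    """Intensity 0..255 from one cycle of a sine wave across the layout."""
    return int(127.5 * (1 + math.sin(2 * math.pi * (position + offset))))

# The same "cue" rendered on two different rigs without reprogramming:
for count in (10, 50):
    levels = [wave_intensity(p) for p in layout(count)]
    print(count, levels[:5])
```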

    The main issue I have with pixel mapping, and yes sure it does have a place of sorts, is that unless you are creating pixel accurate bespoke content, it easily becomes generic wall paper, is very hard to manipulate accurately, and almost impossible to use with music which has a non-linear or syncopated structure. If it is bespoke, altering the fixture count requires remaking the content.

    I feel as if what is being requested can be achieved by other means, it just takes a shift in thinking perhaps to get there. The beauty of coming at this from a different angle is that it's not fixated to any number of parameters or fixture types and thus should offer support for future fixture releases of any kind however off the wall they become as they come to market.

    I'm very excited about the future plans of the development team and quietly confident a good all-round solution can be found. I'm just not sure old-style pixel mapping is the whole answer.

    Regards

    Ross
  • MLorenz Registered User, Hog Beta
    edited February 2014
    Ross, very good comment!!!
    It should for sure be taken to the next step, like I said in another post, because just making a copy of what the others have is almost like not having it at all.
    The reusability of programmed FX, patterns or whatever is what really takes it to the next level.
  • frank_schotman Registered User, Hog Beta
    edited February 2014
    Same here, I agree with Ross. Yes, we need to be able to do what the others are doing...... but I think when we come out with some kind of layout / pixel mapping tool we should not just draw level with our competitors but jump over them :09:
  • Sharpsynth Registered User
    edited March 2014
    I completely agree on the need to implement pixel mapping within the console.
    Versatubes, Chauvet Color rails and Epix, even High End's SolaWash fixtures have pixel control of their LEDs. As someone who does a lot of high-level busking at festivals, the 'pixel mapping' effect, i.e. the organic flow that can occur over mapped fixtures, is something that's in high demand. Yes, if you're going to go about a super complex multi-universe installation then Madrix or any media server implementation is better. But for everything else it's a necessity.
    As for computing power, I'll drag the history train back further than the current examples of Chamsys and resurrect the old LED-trix that was part of LightJockey, as an example that computing power is not the problem here.
  • MLorenz Registered User, Hog Beta
    edited March 2014
    Computing power has never been the problem.
    Distributed systems are just a bit more complex to handle compared to a single Artnet output.
  • stagelites Registered User, Hog Beta
    edited April 2014
    MA lighting has been able to implement it using their nodes. I do not think that it would be a problem on the DP8000.

    Pixel mapping, effects engines, linear cues, loops. All of them are tools to get the result we want. Each one has its own merits and disadvantages. Personally the more tools the better.
  • thekid2112 Registered User
    edited June 2014
    I would also appreciate some kind of bitmap effects engine out of the board.
  • Playdoe9 Registered User
  • stagelites Registered User, Hog Beta
    Exactly. This has been too long in coming. Looking forward to it
  • stagelites Registered User, Hog Beta
    Now they just need to get it into our hands.
  • MLorenz Registered User, Hog Beta
    During LDI it will be shown.
    Beta should then be available soon after LDI I guess. Some more to come