Jason Matthew Lee

October 26 – November 23
    • In information theory, entropy is the average amount of information contained in each message received, where a "message" stands for an event, sample, or character drawn from a distribution or data stream. Entropy also characterizes the uncertainty about the source of information.
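The "average amount of information" above has a precise form, Shannon's entropy. A minimal sketch in Python (the helper name `entropy` is my own, for illustration):

```python
from math import log2

def entropy(probs):
    """Shannon entropy: the average information, in bits, of a message
    drawn from a distribution with the given probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries less information per toss.
print(entropy([0.9, 0.1]))   # ~0.47
```

The more uniform the distribution, the higher the entropy, and the less predictable each message from the source.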
      Jason Matthew Lee uses analog and digital source materials that mark a specific event or occurrence. As information is iterated into the works, layering and density obscure the relationship between output and source.