The original sketch that I wrote was successful in that it detected faces and then blurred them out. However, the blurring was slightly askew, creating the image of a distorted face sitting on top of another. Had I been short of time I would have settled for this outcome, but I had enough time to review where I was going wrong and correct the issue.
However, reviewing the code I could not find an exact reason why the sketch was not working, but I believe it was down to the facial tracking area of the code. Fortunately, this proved not to be much of an obstacle, because it was a simple matter of moving my code over to the ‘LiveCamTest’ example code.
Here is the code for this sketch, showing the successful facial tracking but unsuccessful facial blur:
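For reference, the sketch was built along these lines. This is a minimal reconstruction rather than my exact code: the cascade, the capture size and the blur amount are assumptions based on the OpenCV for Processing examples, and in my broken version the blurred region was drawn back at the wrong offset.

```java
import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture video;
OpenCV opencv;

void setup() {
  size(640, 480);
  video = new Capture(this, 640, 480);
  opencv = new OpenCV(this, 640, 480);
  // load the frontal face cascade used by the library's face examples
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  video.start();
}

void draw() {
  opencv.loadImage(video);
  image(video, 0, 0);
  for (Rectangle face : opencv.detect()) {
    // copy the detected face region, blur it, and draw it back
    // at the same coordinates (my original drew it slightly askew)
    PImage region = video.get(face.x, face.y, face.width, face.height);
    region.filter(BLUR, 6);
    image(region, face.x, face.y);
  }
}

void captureEvent(Capture c) {
  c.read();
}
```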
In my personal opinion, social media heavily influences a large number of people online and how they interact within the virtual space. I know people who in real life are extremely shy but who, on social media, present themselves as confident and almost arrogant. To me, this is a good example of how social media influences the self-representation of identity, and because of this I looked into articles on the correlation between social media and identity. A perfect example of what I am trying to portray is this quote from Green (2013), stating that “the fact that on social media sites, we consider our profiles to be presentations of who we are. Therefore, through interaction with the social medium, the real and ideal selves intersect; and the ideal self is at least partially actualized. In essence, our online selves represent our ideals and eliminate many of our other real components.” It is almost as if identity on social media is starting to overbear ‘real life’ identity.
Green, R., 2013. The Social Media Effect: Are You Really Who You Portray Online?. [online] The Huffington Post. Available from: http://www.huffingtonpost.com/r-kay-green/the-social-media-effect-a_b_3721029.html [Accessed 8 Jan. 2015].
When researching my idea, I wanted to find a good theme and a strong reasoning behind the idea of blurred faces, and this is where identity became apparent as a way of reinforcing it. It is apparent that in recent years technology has increasingly influenced how the world perceives identity, or at least how people self-identify; it almost absorbs people’s individuality. A relevant quote from Taylor (2011) states that society feels “..compelled to promote and market these identities through social media. The line between person and persona, private and public self become blurred or erased completely and the so-called self-identity becomes a means of our acceptance and status.”
Another extreme example of how identity is being altered by technology is gaming, where people spend more time developing their online character’s profile and sometimes get into the ‘role’ of the character (Ghosh, 2013).
Taylor Ph. D., J., 2011. Technology: Is Technology Stealing Our (Self) Identities?. [online] Psychology Today. Available from: https://www.psychologytoday.com/blog/the-power-prime/201107/technology-is-technology-stealing-our-self-identities [Accessed 12 Jan. 2015].
Ghosh, P., 2013. Web ‘re-defining’ human identities. [online] BBC News. Available from: http://www.bbc.co.uk/news/technology-21084945 [Accessed 12 Jan. 2015].
The idea that I want to explore is identity: using the ‘LiveCamTest’ sketch (which tracks faces) to create a piece that removes the participant’s identity, and the best way of conveying this is by removing their face from the screen. I want my interactive piece to make the user question whether or not technology removes the identity of the user, by literally having technology remove the identity of the audience member. A further question on top of this is whether it is actually an improvement or a worrying rising culture.
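A minimal sketch of how this could work, built on the ‘LiveCamTest’ example: each detected face is covered with a solid black rectangle. The cascade and capture size are assumptions; the final piece may replace the rectangle with something more considered.

```java
import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture video;
OpenCV opencv;

void setup() {
  size(640, 480);
  video = new Capture(this, 640, 480);
  opencv = new OpenCV(this, 640, 480);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  video.start();
}

void draw() {
  opencv.loadImage(video);
  image(video, 0, 0);
  noStroke();
  fill(0);
  // draw a solid rectangle over each detected face,
  // literally removing the viewer's identity from the frame
  for (Rectangle face : opencv.detect()) {
    rect(face.x, face.y, face.width, face.height);
  }
}

void captureEvent(Capture c) {
  c.read();
}
```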
I hope this idea is strong enough to convey a point, but simple enough for the audience to understand what is going on. To gain a better understanding, I am going to research how technology correlates with and influences society’s self-image, and how this is changing.
My original idea was to include sound merged with video, so that movement would create a sound. To get some inspiration, I researched the correlation between motion and sound and tried to find videos of software or exhibitions where it had already been done, and done successfully.
An interesting video I found was of a piece of software/hardware called ‘Motion Synth’, an external device that connects to an iPhone or Android. This is something I would have liked to have created, but I suspect the software relies on the smartphone’s gyroscopic capabilities to create the synth effect; you can see the user rotating their hands in a spherical motion as well as back and forth.
A much more basic version of this is the theremin, and I thought about whether it would be possible to create the same set-up, where two hands control what sounds are created. To do this I am going to import the ‘minim’ library, which includes sound examples that I can research for a more developed idea.
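A first experiment along these lines could use minim’s oscillator UGen, loosely imitating a theremin’s two antennae with the two mouse axes (the frequency range and amplitude mapping here are my own assumptions, and hand tracking would later replace the mouse):

```java
import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
AudioOutput out;
Oscil wave;

void setup() {
  size(640, 480);
  minim = new Minim(this);
  out = minim.getLineOut();
  // a sine oscillator whose pitch and volume we control live
  wave = new Oscil(440, 0.5f, Waves.SINE);
  wave.patch(out);
}

void draw() {
  background(0);
  // horizontal position controls pitch (110 Hz to 880 Hz),
  // vertical position controls volume, like a theremin's antennae
  wave.setFrequency(map(mouseX, 0, width, 110, 880));
  wave.setAmplitude(map(mouseY, height, 0, 0, 1));
}
```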
I was advised by my lecturers to import different contributed libraries into Processing if I were to explore different types of sketches that the programme is capable of. Because I wanted to use the camera as part of my interactive pieces, I imported ‘OpenCV’ as it contained good examples of camera sketches, such as Facial Detection, Live Cam Test and Find Edges.
OpenCV is described on its website as: “..an open source computer vision and machine learning software library. OpenCV was built to provide a common infrastructure for computer vision applications and to accelerate the use of machine perception in the commercial products. Being a BSD-licensed product, OpenCV makes it easy for businesses to utilize and modify the code.” (OpenCV, 2015).
What is good about contributed libraries is that you can write a single line, like “import gab.opencv.*;”, and the library’s functions become available to your sketch. I will definitely use this for my final interactive piece in Weymouth House.
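For example, the Find Edges style of sketch takes only a few lines once the library is imported. This is a sketch of the idea rather than the library’s exact example, and the image filename here is a placeholder:

```java
import gab.opencv.*;

OpenCV opencv;
PImage src;

void setup() {
  // "test.jpg" is a placeholder image in the sketch's data folder
  src = loadImage("test.jpg");
  size(640, 480);
  opencv = new OpenCV(this, src);
  // run Canny edge detection on the loaded image
  opencv.findCannyEdges(20, 75);
}

void draw() {
  // display the edge-detected result
  image(opencv.getSnapshot(), 0, 0);
}
```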
At the end of the workshop, we were given a small task to write the code for a working clock and date in real time. After some research, I used string functions to display all the parts of the time and date in a small grey box; shown in the image below is the code and its outcome.
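My solution was along these lines (a sketch from memory; the exact layout values and colours are illustrative):

```java
void setup() {
  size(200, 100);
}

void draw() {
  background(255);
  // small grey box behind the text
  fill(120);
  noStroke();
  rect(10, 10, 180, 80);
  fill(255);
  // nf() pads each value to two digits, e.g. 9 becomes "09"
  String time = nf(hour(), 2) + ":" + nf(minute(), 2) + ":" + nf(second(), 2);
  String date = nf(day(), 2) + "/" + nf(month(), 2) + "/" + year();
  text(time, 20, 40);
  text(date, 20, 70);
}
```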