Human language has a seemingly limitless capacity for metaphor. Hypothetically, this could let us enrich metadata with poetry, so that a Google image search for "eyes" returns images of stars in addition to pupils.

A few weeks ago, "Poetry for Robots," a project inspired by the Argentine writer Jorge Luis Borges and run as a partnership between Neologic, Webvisions, and the Center for Science and the Imagination at Arizona State University, launched to explore metaphor and its limitless possibilities. The project aims to put Borges' hypothesis to the test, asking on its site whether it is conceivable to teach machines the metaphoric nature of human language.

Corey Pressman, a partner at Neologic, a digital agency that helps clients build sites, applications, and other digital experiences, believes that as a species we often think about the world in a metaphoric, poetic way.

The Poetry for Robots experiment has two phases. The first is the data-collection phase, which launched a week ago. The team hopes to crowdsource poetic verses in response to a set of 120 images on the site and use them as metadata for its image database. Users are asked to submit verses of up to roughly 150 characters responding to a specific picture: a flower, a candle glowing in the dark, a girl holding a balloon. The team will accept submissions through mid-year, compiling the verse into an extensive collection of metadata for the image database. They plan to unveil the results at Webvisions Chicago in September, where they hope to show that Borges' hypothesis was right, shifting machine learning away from the scientific, bland, and literal toward the playful, poetic, and human.

The Poetry for Robots project could change the way we interact with machines. Today, to search an image database, people must "think like machines": to find images of stars, one searches for "stars," not "eyes." This utilitarian approach works, but it breaks down at the limits of the human, where more abstract experiences such as sadness and beauty are hard to capture as image metadata.
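To make the contrast concrete, here is a minimal sketch (not the project's actual implementation) of how metaphor-enriched metadata could broaden image search. The image names and tags below are hypothetical.

```python
# Sketch: keyword search over image metadata via an inverted index.
# All image names and tags are made up for illustration.

def build_index(images):
    """Map each tag to the set of images carrying it (an inverted index)."""
    index = {}
    for name, tags in images.items():
        for tag in tags:
            index.setdefault(tag, set()).add(name)
    return index

# Literal, machine-style metadata: each image tagged with what it depicts.
literal = {
    "night_sky.jpg": {"stars", "sky", "night"},
    "portrait.jpg": {"eyes", "face", "pupil"},
}

# The same images with hypothetical crowdsourced poetic tags layered on top.
poetic = {
    "night_sky.jpg": {"stars", "sky", "night", "eyes"},   # "the night's thousand eyes"
    "portrait.jpg": {"eyes", "face", "pupil", "stars"},   # "stars caught in a gaze"
}

# With literal tags only, "eyes" finds just the portrait.
print(sorted(build_index(literal).get("eyes", set())))   # → ['portrait.jpg']

# With poetic tags, "eyes" also surfaces the starry sky.
print(sorted(build_index(poetic).get("eyes", set())))    # → ['night_sky.jpg', 'portrait.jpg']
```

The design choice here is simply to treat poetic responses as additional tags alongside literal ones, so metaphoric searches widen results without breaking literal ones.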

The second phase of the Poetry for Robots project depends on how the metadata experiment plays out. If the first phase succeeds, the ability of machines to search like humans could in turn lead to machines that compose verse as well.

While creative work has often been seen as a safe haven from the robot takeover, computers have already demonstrated that they are capable of creative endeavors. Machines have had their poetry published in literary magazines, and various bots mine tweets as fodder for the poems they compose on Twitter.

Pressman, who is also an artist, is the first to concede that there is something disconcerting about machines making art. Yet despite the uncanniness of creative machines, he cites Brian Eno and Peter Chilvers' generative music app Bloom as an illuminating example of the beauty that can still be found in machine-produced art.

The Poetry for Robots project could radically advance this relatively young field of creative machine learning by enabling computers to better grasp metaphorical language, which, if Borges is right, is a large part of what makes us human. As neo-Luddites of all stripes lament the apparent mechanization of humanity, Poetry for Robots is attempting to reverse the trend by humanizing the machines.