Turn to artificial intelligence: AlphaGo can play Go with you, and Google Translate lets you talk with friends from other countries. But doesn't it feel as though something is missing? These cold AI products seem to perform only mechanical, instinctive motions. Can we go further and give artificial intelligence something more? Emotional intelligence, say...
People tell me to be prepared: one day, artificial intelligence will take my job. Whether that leaves me destitute, adrift, or facing endless free time and a fight for survival depends on whom you ask. It seems it is time to figure out what kinds of work only humans can do, and then frantically reshape ourselves to fit those roles, so we are not left standing, helpless, at the end of some robotic game of musical chairs.
Predictions about automation rarely account for emotional labor as a form of work, perhaps because it is intangible and hard to quantify or monetize. Work that consists of caring, shouldering burdens, and making other people's well-being one's own responsibility is largely overlooked, like so much "women's work," although in recent years conversations about its hidden costs and the inequalities of such labor have gained momentum.
Armed with the marvelous tools of the digital society, we are in theory able to receive and give more support than ever before. Social media platforms let us deepen our understanding of one another and stay in close touch, so we tend to assume that this knowledge promotes empathy and connection. We also understand structural inequality and global humanitarian crises better than we used to. But who is doing the actual teaching?
For most people, myself included, modern technology and social media infrastructure have not made life easier; in fact, they have fueled the demand for emotional labor without any extra pay. As with almost all such work, those who do the heaviest lifting get the least credit. On Twitter it is largely women who regularly field questions about race, intersectionality, and politics, and who must risk harassment simply to speak. If you have spent time on social media and come away with some insight, you mostly have volunteers to thank: people who, under pressure (and while others stand to profit), provide that content free of charge.
Given the chance, I would like to try this kind of work myself, but I have slowly come to realize that emotional labor can also be intimate. The energy women spend, to varying degrees, maintaining relationships is a prime example. In the Facebook era, the daily struggles of my friends' lives are always in front of me, making it hard to pretend I cannot see this or that appeal for help or support, mainly because the boundary between my work and my life keeps dissolving. At any given moment I might spend time holding up a conversation with a friend I am not actually close to, or digging in online against the views of strangers I will never meet.
Of course, "I spend too much time on social media" is a privileged complaint. But all told, my friends and I find ourselves increasingly anxious and drained at the end of each day's work, as if we have become slaves to our feeds, our hearts lonelier and emptier. Since the 1970s, the number of women choosing to skip motherhood has doubled. All sorts of historical and economic factors are at play, but I often wonder: what if today's women simply feel they have no love left to give?
In the 1960s, Joseph Weizenbaum created a therapist chatbot called ELIZA at the MIT Artificial Intelligence Laboratory. Though he never intended to design a "real" AI therapist, Weizenbaum was surprised to find that, as the program offered warm prompts and affirming responses keyed to the "patient's" statements, his secretary became gradually drawn to ELIZA and even showed affection for it. What began as an ironic jab at the illusion of simulated emotion has since evolved into a path for studying the human mind.
Weizenbaum did not expect so many people to take an interest in ELIZA, to feel intimate with her, to want to type their secrets into her glowing screen over the decades that followed. The unexpected affection ELIZA inspired offers an important clue, and it has fed our hopes for AI ever since: we are eager to turn AI into emotional laborers, and we are happy to do so whether or not they get anything in return.
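ELIZA's trick was not understanding but surface manipulation: keyword rules plus pronoun "reflection," so the patient's own words came back as questions. A minimal sketch of the technique (the rules here are illustrative, not Weizenbaum's original script):

```python
import re

# Pronoun reflections: turn the "patient's" words back on them.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "i", "your": "my", "mine": "yours"}

# Keyword rules in rough priority order, ELIZA-style; the last rule
# is the famous content-free fallback.
RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r".*", "Please, go on."),
]

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def eliza(statement):
    text = statement.lower().strip(" .!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
```

Saying "I feel lost in my work" yields "Why do you feel lost in your work?", which is all it takes, apparently, for a glowing screen to feel like a confidant.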
We have long wondered how AI might take over these tasks, whether it would tend to the human heart or shoulder the daily burdens of an already unjust society, and how it would do so. Robot therapists, butlers, maids, nurses, and sex dolls are all familiar fixtures of techno-utopian fantasy: while we enjoy our comfortable lives, they do all the chores we would rather not. In reality, though, physical robots can be trained for caregiving work, or more broadly for the service and labor industries.
I saw my first robot toy in 1985: a teddy bear named Teddy Ruxpin, who read aloud to children thanks to a cassette tape inserted into his belly. In the TV commercials, the bear came home from school with latchkey kids whose parents, presumably caught in the currents of the era, were shuttling between office towers; at night he told the children stories and sang lullabies, his furry chin always bobbing in time. The same year, the fourth Rocky film was released, in which Sylvester Stallone's famous boxer, certainly rich by now, gives his old friend Paulie a talking robot butler. It was peak 1980s, implying a level of prosperity that could build a staircase leading straight up into a high-tech future. The actual robot in the movie, Sico, had been working to help autistic children communicate before going Hollywood. In the film, Paulie, for reasons never explained, manages to refit the clunky manservant into a social companion with a feminine voice, and grows ever fonder of it ("She loves me!" he insists).
Perhaps for children, care can be gender-neutral, like a gentle teddy bear in overalls. In the adult world, though, we still treat service and caregiving as women's domains by default. Why today's AIs so frequently use female voices or female personas has been the subject of research, debate, and speculation. Some say we habitually associate service and compliance with women; male technology consumers, meanwhile, tend to link luxury with gender, and female voices are generally considered more pleasant. Azuma Hikari, Japan's answer to Alexa, is a virtual assistant who tells her "master" that she misses him while he is away and cannot wait for him to come home. This not only mixes gender and obedience in uncomfortable ways; it also folds companionship, care, and everyday interaction into the emotional work of the digital age. We want our robots to be women because we are used to getting our emotional labor from women.
I fancy myself someone committed to dismantling patriarchy and all that, but when I absurdly find myself saying "thank you" to Alexa and she does not respond, I still feel a twinge of disappointment. Of course, Alexa only listens to me once she hears her wake word; otherwise, she may well be eavesdropping. But the interaction still leaves me at a loss, because without that extra bit of lively back-and-forth, nothing tells me that I am not imposing, that my needs are normal. I do not just want her to play a song or tell me the weather; I want her to think my question was a good one.
This particular impulse may not be good for a healthy society. In an article titled "The Danger of Outsourcing Emotional Labor to Robots," Christine Rosen cites research warning that this kind of artificial comfort zone may erode our vocabulary of care. In other words, if a robot can answer commands with a polite smile, will we stop weighing the cost of asking humans to do the same? Any outsourcing risks devaluing the local workforce: we may become less compassionate, watch our own emotional intelligence atrophy, or invent strange new social signals about who deserves (or can afford) care. If our virtual assistants and emotional laborers all become soothing, feminized AIs, will that narrow some gap between AI and humans? And can they ever be fully accepted by society?
Complicating all of this is the fact that robots, virtual assistants, productivity software, email filters, data-processing algorithms, and everything else under the sun are now marketed under the banner of "AI," even though many are still just crude algorithms or pattern-matching software. Google hopes a bot can help identify toxic internet comments, while Facebook is testing an AI that flags users who may be suicidal and offers intervention options. As Ian Bogost wrote of the term's new meaning, these solutions are deeply imperfect and easily abused: "artificial," but not particularly "intelligent."
Still, the potential of AI (or software, or algorithms) remains great, and online life is a key arena. Feel Train, a creative technology cooperative based in Portland, teamed up with the prominent Black Lives Matter activist DeRay McKesson to launch a Twitter bot called @staywokebot, designed to deliver supportive messages to Black activists and absorb some of the strain of social media noise. Ultimately its goal is to serve as a first line of response to "101-level" questions such as "why don't all lives matter?" The bot can already tell people how to contact their local representatives; a future goal is to field answers to common, complicated social-justice questions, sparing activists some of their active participation in those conversations.
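A first-line bot of this kind can be surprisingly simple. The sketch below is hypothetical, not @staywokebot's actual code: it routes common questions to canned answers by keyword overlap, so human activists only handle the conversations that truly need them (the answer strings are placeholders).

```python
# Canned "101-level" responses, each paired with trigger keywords.
CANNED_ANSWERS = [
    ({"contact", "representative", "representatives"},
     "Here is how to reach your local representatives: ..."),
    ({"all", "lives", "matter"},
     "A primer on why 'all lives matter' misses the point: ..."),
]

def first_line_reply(question):
    """Return the best-matching canned answer, or None to escalate to a human."""
    words = set(question.lower().replace("?", " ").split())
    best_answer, best_overlap = None, 0
    for keywords, answer in CANNED_ANSWERS:
        overlap = len(keywords & words)
        if overlap > best_overlap:
            best_answer, best_overlap = answer, overlap
    return best_answer
```

The design choice that matters is the `None` return: anything the bot cannot confidently match falls through to a person, rather than the bot guessing at a sensitive topic.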
Then there are the content moderators facing dystopian horrors on platforms like Facebook, work that might look unskilled or uncomplicated but, as a 2014 Wired feature detailed in particularly grim terms, is anything but. For now, algorithms can only crudely guess at the mood or context of a joke, phrase, or image, so human judgment remains essential. The problem is that real people must look at every piece of potentially violating content, day after day, and weigh its merits one item at a time. Here an intelligent machine could at least form a first line of defense, so that human moderators need only examine the subtler, more ambiguous cases.
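The "first line of defense" idea reduces to a triage rule. This is a hypothetical sketch (the thresholds and items are invented for illustration): a classifier sweeps up the obvious cases at both ends of its confidence range, and only the ambiguous middle reaches a human.

```python
def triage(score, remove_at=0.95, approve_at=0.05):
    """score: a model's estimated probability that content violates policy."""
    if score >= remove_at:
        return "auto-remove"    # clear violation, no human needed
    if score <= approve_at:
        return "auto-approve"   # clearly benign
    return "human-review"       # ambiguous: human judgment still required

# Most items never reach a person; only the borderline joke does.
queue = {"spam link": 0.99, "vacation photo": 0.01, "dark joke": 0.55}
decisions = {item: triage(score) for item, score in queue.items()}
```

The thresholds encode a policy trade-off: widening the middle band protects against the model's mistakes but pushes more of the burden back onto human moderators.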
Mitu Khandaker-Kokoris is the chief creative officer of a London-based software company that focuses on using artificial intelligence to develop more humane, sensitive character interactions, both inside game worlds and in the thornier realm of community management. Gaming communities are among the many complicated spaces where people want to test boundaries even as they search for cultural places that feel safe. She introduced me to one of her company's tools, Ally, which aims to make social platforms of all kinds safer and more inclusive.
"How do we deal with direct emotional abuse between people, and how can we intervene?" Khandaker-Kokoris said. For moderators and victims alike, waiting until a problem is resolved is very hard.
Ally proposes to identify potentially problematic signs in interactions, covering not just speech and direct contact but behaviors like stalking and harassment. From there, an AI character, whose parameters are set by the owners of whatever environment the product runs in, checks in with the target of the behavior and determines whether any action needs to be taken.
This approach lets users define their own personal boundaries, and the AI learns from its interactions with each user when and how to intervene. "Boundaries are super complicated," Khandaker-Kokoris said. "We're okay with certain things at certain times and not others, and it might even depend on your mood at the time. So this AI character, and your interactions with it, can act as a mediator for your interactions with the rest of the community. I think this is a clear case where we can reduce the emotional burden on both victims and moderators."
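One way to picture per-user boundary learning is a simple feedback tally. This is a hypothetical illustration, not Ally's actual implementation: the agent records which behaviors a user has said were or were not okay, and checks in whenever a behavior is unknown or unwelcome.

```python
class BoundaryModel:
    """Per-user record of which behaviors cross this user's boundaries."""

    def __init__(self):
        self.feedback = {}  # behavior -> [times_not_ok, times_ok]

    def record(self, behavior, was_ok):
        not_ok, ok = self.feedback.get(behavior, [0, 0])
        self.feedback[behavior] = [not_ok + (not was_ok), ok + bool(was_ok)]

    def should_intervene(self, behavior):
        not_ok, ok = self.feedback.get(behavior, [0, 0])
        if not_ok + ok == 0:
            return True   # unknown behavior: err on the side of checking in
        return not_ok > ok

model = BoundaryModel()
model.record("unsolicited-direct-messages", was_ok=False)
model.record("friendly-banter", was_ok=True)
```

Even this toy version captures the point in the quote above: the same behavior can sit on different sides of the line for different users, so the boundary has to be learned per person rather than set globally.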
Although Khandaker-Kokoris does share the hesitation many people feel about outsourcing emotional labor to automation, on the whole she and I agree that the tech sector needs to keep working toward a better understanding of emotional labor, whether to deconstruct it or to entrust machines with something deeper. Talking to her gave me hope that selectively applied AI interventions might help people, especially women and people of color, draw better personal boundaries in environments that are more exhausting, more overwhelming, and more demanding than ever before.
Meanwhile, the technology industry seems set to keep using women's voices in its products, even if they are unlike the voices we hear in real life, and a new wave of smarter virtual assistants will surely be visited upon us: assistants built to soothe, coax, and reward us, training us through our smart phones, smart homes, and smart cars.
But for now, for those who are tired of online life, getting emotional intelligence from our technology is still a distant dream.