Updated: November 14, 2022 @ 5:41 pm
This past June, Microsoft released GitHub Copilot, its new cloud-based artificial intelligence, which suggests code based on a user-provided description of the code they would like the AI to write for them. The AI was trained on all of the publicly viewable code repositories on GitHub, many of which are open source or otherwise permissively licensed. As a brief refresher, open source in the software realm means that you can see the actual source code of a program, as opposed to the compiled binaries that only a computer can read. While programs on GitHub are often unequivocally open source and accessible to the public, their licenses frequently contain binding conditions that can be legally enforced, and historically have been.
Take, for example, the GNU General Public License v3.0 (GNU GPLv3), which permits code under its license to be used for commercial purposes, but requires that the source of any derived code also be made available, and that the derived code be distributed under the same GPL license. The license also disclaims warranty and liability on the author's behalf, so that the author is not obligated to maintain the program through bug fixes and security updates. Since GitHub Copilot bases its recommended code on sources that are unknown to the user, the legal risk that the user takes on is not apparent.
Beyond the legal problems arising from these licensing requirements, the release of this software has raised many ethical questions about using other people's intellectual property as a training dataset for such artificial intelligence programs. Microsoft profits by charging people to use this software, which effectively regurgitates code that other users wrote, with neither attribution nor monetary compensation for the actual creators of the code.
Furthermore, such AI programs diminish the need for human intellect in solving nuanced logical problems. This technology has the potential to restrict the overall need for programmers and the role they play in society by piggybacking off the work of others. This is comparable to the issues that the fine arts are beginning to experience, with the creation of musical pieces that sound like Bach but are really composed by an artificial intelligence.
All in all, the creation and marketing of this product further highlights the immorality of big tech corporations. Microsoft itself relies on closed-source code for its own products, yet it leeches off other developers' open-source work in order to turn a quick profit. GitHub is a massive repository of programming knowledge meant to be accessible to the public. It is preposterous to allow Microsoft to illegally exploit such open-source and free programs for profit. The people who put work into these code snippets are left with no due credit, and their small and simple licensing terms are cast aside and ignored.
The views expressed are those of the writer and do not necessarily reflect those of The Torch.