Computer-Assisted Analysis of Literary Texts; and more! Access to image-based resources is fundamental to research, scholarship, and the transmission of cultural knowledge. Digital images carry much of the information content in the Web-based delivery of books, newspapers, manuscripts, maps, scrolls, single-sheet collections, and archival materials.
The International Image Interoperability Framework (IIIF) has three aims: to give scholars an unprecedented level of uniform and rich access to image-based resources hosted around the world; to define a set of common application programming interfaces that support interoperability between image repositories; and to develop, cultivate, and document shared technologies, such as image servers and web clients, that provide a world-class user experience in viewing, comparing, manipulating, and annotating images.
This course will introduce students to the basic concepts and technologies that make IIIF possible, allowing for guided, hands-on experience in installing servers and clients that support IIIF, and in using the advanced functionality that IIIF provides for interactive image-based research, such as annotation.

This course is aimed at humanities scholars interested in tapping into the data streams and functionality offered by platforms and content providers such as Twitter, Google, and the New York Times.
Introduction to APIs will open with the basics of Python, a scripting language widely used in industry and the academy because of its human readability. We will proceed to the fundamentals of working with Application Programming Interfaces (APIs), the most common way to programmatically access web-based services and data. The course will be useful for those interested in understanding programming concepts, developing applications, and working with data.
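As a small taste of what working with an API involves, here is a hedged sketch of assembling a request URL and parsing a JSON response; the endpoint, parameters, and payload below are invented for illustration, and a real provider would supply its own base URL and API key.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint -- a real provider (e.g., a newspaper's API)
# publishes its own base URL and required parameters.
BASE_URL = "https://api.example.com/v1/articles"

def build_query_url(base_url, params):
    """Compose a GET request URL from a base endpoint and query parameters."""
    return base_url + "?" + urlencode(params)

url = build_query_url(BASE_URL, {"q": "digital humanities"})
print(url)

# A typical JSON payload an API might return, parsed with the stdlib.
payload = '{"results": [{"title": "DH and APIs", "year": 2019}]}'
data = json.loads(payload)
print(data["results"][0]["title"])
```

In practice the payload would come back from an HTTP request rather than a string literal, but the parsing step is the same.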
Participants will use DH Box, a cloud-based digital humanities laboratory, as their development environment.

This course teaches participants how to use ethical visualization principles and practices to visualize treacherous, or culturally problematic, data. Such data includes racist historical documents, ideologically laden materials, culturally controversial texts, politically charged topics, gendered works, etc.
Aimed at people who work with culturally sensitive datasets, and those who are interested in critical reflection on visualization practice, the course will combine hands-on activities and discussion. Participants will create data visualizations using R and instructor-provided stock code, and then interrogate their visualizations, identifying the extent and severity of the ethical pitfalls they inevitably contain.
By the end of the week, participants will have produced several visualizations and prepared a position statement on ethical visualization appropriate for their own cultural and disciplinary contexts.
No previous knowledge in coding, R, or visualizations is required. Participants are welcome to bring their own treacherous data, or they may use sample projects provided by the instructors. If you are unsure as to whether your data will work in this class, please feel welcome to contact the instructors in advance.
This course explores how opening access to data changes the digital humanities project. We will cover the reasons for publishing open data, how we can create open data, and how we can work with open data. We will see how linked open data allows us to share data and incorporate data from other projects.
We will learn about data models, data formats, and software tools for working with linked open data. Consider this offering in relation to the following: Applied Theories and Methods (good for evaluating the vocabularies that we find); Feminist Digital Humanities: Theoretical, Social, and Material Engagements (good for evaluating vocabularies that we find); Queer Digital Humanities; How to End and Archive Your Digital Project; Agile Project Management.
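As a minimal sketch of what a linked open data record can look like, here is a hypothetical JSON-LD document; the vocabulary is schema.org, and the item and author identifiers are invented for the example.

```python
import json

# A toy JSON-LD record: "@context" names the vocabulary, "@id" gives the
# resource a dereferenceable URI, and nested "@id"s link out to other data.
# All identifiers below are illustrative, not real records.
record = {
    "@context": "https://schema.org",
    "@type": "Book",
    "@id": "https://example.org/items/42",
    "name": "A Digital Edition",
    "author": {"@id": "https://example.org/people/ada"},
}
print(json.dumps(record, indent=2))
```

Linking by URI, rather than embedding copies of data, is what lets projects incorporate each other's records.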
Wearable technology (WT) is moving closer to, and even into, the human body, effectively rendering itself invisible. This movement of technologies from our hands onto our skin should, but often does not, account for our broader, felt experiences. In this seminar we will explore the central role of palpability, the felt experience of our active senses, in WT design. The seminar will begin with readings and discussion, followed by small movement explorations to bring awareness to the rich information provided through our active, seeking senses.
The remainder of each session will be spent tinkering with existing wearable technologies and dreaming up designs for new WT through hands-on play. Participants will be able to explore and create with existing wearable technologies as well as various wearable microcontrollers, sensors and feedback output devices. No prior movement experience or experience with physical computing is assumed, but participants should come wearing comfortable clothing for movement explorations.
Not only virtual worlds but also existing 3D models can be explored and experienced anew in VR. Other media content can likewise be transferred into this environment, for which there are currently few conventions of presentation. Its still partly experimental use allows us to find new modes of interaction and presentation. In this course, we want to try out new ways of representation and give participants the opportunity to use WebVR for their own work.
Video games are encoded with historical memory and interpretation, and they often serve to confirm a popular historical narrative or offer simple solutions to complex social and political moments. Students need the skills to challenge these common interpretations and be able to critically read these important cultural texts. Games allow students to employ traditional historical skills and learn about the interpretive nature of history. This class will explore the ways in which historically based video games can be used as a tool of historical inquiry and critical analysis, and as a means for helping students craft historical narratives.
Part one will explore existing examples of this pedagogical practice, discuss planning and curriculum development, and critically examine the theoretical and practical implications of the practice with students. Part two will be hands-on with Twine and other lightweight game engines to create our own games, which can be used as models for student work.
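A Twine game is authored as linked passages; in twee notation, a hypothetical branching passage (invented for illustration, not course material) might look like:

```
:: The Port of Boston
It is December 1773. Crates of tea sit on the docks.

[[Board the ship->Boarding]]
[[Report to the governor->Loyalist]]
```

Each `[[...]]` link sends the player to another passage, which is where students can stage competing historical interpretations of the same moment.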
Here we take a discipline-specific approach to video games and offer practical ways of implementing them in lower-division survey courses and upper-division research seminars. Participants will leave class with a model assignment, a prototype Twine game, and practical advice for implementing the project in upper- or lower-division history curricula.

The purpose of this course is to improve our digital security and our understanding of the digital security challenges in our lives as researchers.
This means ensuring best practices for our information security: keeping our software up to date and using encryption to secure our machines and their communications, including, but not limited to, SSH, mail, and messaging. Related to this is a matter of increasing importance in our current political and social climate: keeping our data safe and long-lived. We will start by leveraging tools like Git or BitTorrent Sync (and, hopefully, IPFS) to mitigate the risks of simple deletion, and move to a discussion of encrypted backups.
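One concrete backup habit is verifying that a copy matches its original; here is a minimal sketch using Python's standard library (the file contents are invented for the example).

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest used to verify a backup copy."""
    return hashlib.sha256(data).hexdigest()

original = b"field notes, 2018 season"
backup = b"field notes, 2018 season"

# Matching digests indicate the backup was neither corrupted nor altered.
print(sha256_digest(original) == sha256_digest(backup))
```

In practice you would hash whole files (reading them in chunks) and store the digests alongside the encrypted backups.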
Part of this discussion will focus on deploying infrastructure alternatives to institutional IT systems, using resources like the Internet Archive to capture our sites and data, and choosing a hosting provider that will defend your data and privacy.
Ultimately, by the end of the course, you should have a much stronger sense of the things you can do in your day-to-day life to ensure that your data is secure and long-lived.

This course provides fine arts experiential learning on location for independent observation and documentation (writing, photography, poetry, filmmaking) to refine our critical abilities as contemporary content creators, specifically as image-makers.
In addition, it will provide a thoughtful analysis of pedagogy around creative, collaborative learning and teaching practices. Our pockets, hard drives, and web pages exceed our comprehension or management of imagery and text. This course will remove the screen between creativity and pedagogy and allow access to visual and textual approaches that can be applied in any number of disciplines or collaborative projects involving the digital humanities.
To eliminate discrepancies in experience and expense across editing software platforms, this course harnesses a DIY, indie spirit and will only require a laptop, freeware editing software, and a smartphone as recording hardware. The course includes informal writing and visual prompts that will be collected in online video sketchbooks. These may take the form of response, research, verse, or assemblage, but are required to contain curated text and imagery.
Indigenisation; Introduction to GraphPoem.

This course is an introduction to the XML ecosystem and its uses in research on literary and historical documents.
Participants will begin with the fundamentals of XML. This will be done using XML database and text-editing applications, all of which are open source.
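As an illustration of those fundamentals, here is a tiny hypothetical XML document and how element content and attribute values are read from it, using Python's standard library; the element names are invented for the example.

```python
import xml.etree.ElementTree as ET

# A minimal, invented XML fragment of the kind discussed in the course.
doc = """<letter>
  <sender>Ada</sender>
  <date when="1843-07-02"/>
  <body>Please find the enclosed notes.</body>
</letter>"""

root = ET.fromstring(doc)
print(root.find("sender").text)       # element content
print(root.find("date").get("when"))  # attribute value
```

The same element/attribute distinction underlies every tool in the XML ecosystem, from editors to XML databases.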
Intended for researchers in the humanities and humanistic social sciences who are newcomers to DH methods, the course assumes no familiarity with scripting or encoding, though beginners might find the pace to be challenging, given the range of material presented. Students who are already familiar with TEI and website design would still benefit from the units on analytic tools, advanced XML, and XQuery, which are central to the course.
Metadata for Digital Humanities; and more!

For those new to the field, this is an introduction to the theory and practice of encoding electronic texts for the humanities. This workshop is designed for individuals who are contemplating embarking on a text-encoding project, or for those who would like to better understand the philosophy, theory, and practicalities of encoding in XML (Extensible Markup Language) using the Text Encoding Initiative (TEI) Guidelines.
No prior experience with XML is assumed, but the course will move quickly through the basics. By providing an overview of textual creation, transmission, and preservation, this course will offer digital humanists an introduction to the methodologies and reference tools (historical, codicological, and contemporary) necessary to understand a book in its original contexts and thus to make informed encoding decisions for the digital era.
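For instance, an encoder must decide which features of a text to make explicit; a hypothetical TEI-style fragment (invented, not drawn from the Guidelines' examples) marking a person and a date might look like:

```xml
<p>She wrote to <persName ref="#ada">Ada</persName> on
  <date when="1843-07-02">2 July 1843</date>.</p>
```

Whether to encode such references at all, and at what granularity, is exactly the kind of informed decision the course prepares participants to make.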
We will explore the technological shifts that made textual culture possible (quill, ink, paper, illustration, TEI, etc.). Attendees should bring clothing that can get dirty for a linocut workshop. This is a seminar-style offering with hands-on elements.
Digital Humanities projects use more and more data every year. Databases are becoming increasingly important foundations for data analysis and data visualizations of all kinds. This course is about building and using databases, whether that means a small personal project like creating a reading list or managing large projects like wrangling unwieldy research materials, performing data science metrics, or analyzing social networks.
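As a sketch of the small-personal-project end of that spectrum, here is a reading list kept in an SQLite database; the table and column names are invented for the example.

```python
import sqlite3

# An in-memory database standing in for a small personal project.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE reading_list (
    title TEXT, author TEXT, finished INTEGER)""")
conn.executemany(
    "INSERT INTO reading_list VALUES (?, ?, ?)",
    [("Middlemarch", "George Eliot", 1),
     ("The Waves", "Virginia Woolf", 0)])
conn.commit()

# Query the books still to be read.
unread = conn.execute(
    "SELECT title FROM reading_list WHERE finished = 0").fetchall()
print(unread)  # [('The Waves',)]
```

The same SQL patterns scale from a reading list to wrangling unwieldy research materials; only the schema grows.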
No prior programming experience is necessary.

This course will explore models for doing DH at four-year institutions. Over the past half-decade, liberal arts colleges and four-year institutions have begun to develop robust programs in the digital humanities. With a focus on teaching, these institutions have also developed frameworks in which we can incorporate students into our research agendas in meaningful and productive ways.
Through a collaborative, interdisciplinary lens we will address approaches to teaching and research, models for sustainable infrastructure, student integration, and project and resource management. Discussion will include administrative issues related to the recognition of collaborative efforts in DH.
Participation is encouraged from across all areas of the institution, including library and IT professionals, administrators, and faculty. Individuals who are interested in growing digital humanities and digital scholarship in their unique institutional settings should attend. Guest lectures will be included as part of the course structure.

For those new to the field, this is an introduction to the theory and practice of encoding electronic musical scores.
This course is designed for those who are interested in a music-encoding project, or for those who would like to better understand the philosophy, theory, and practicalities of encoding notated music in XML (Extensible Markup Language) using the Music Encoding Initiative (MEI) Guidelines.
Moreover, it will consider ways of incorporating sound and TEI files with encoded notation. Participants should have a basic knowledge of how to read music, but no prior experience with XML is assumed.
This course focuses on literacy with approaches to digital storytelling, fluency with resources, and making individual or collaborative digital stories. Course topics include storytelling as a fundamental human activity, combining storytelling techniques and computational technologies, organizing and managing digital storytelling projects, and using digital storytelling for Digital Humanities (DH) scholarship and pedagogy.
No previous experience with digital storytelling or associated platforms is necessary. Examples and resources are provided.
Participants will learn basic approaches and tool utilization, and may leverage these and other course resources for ongoing DH projects, or experiment freely.
A storytelling artifact (collaborative or solo) demonstrates course outcomes at the end of the week. Learn more at the course webpage.

Introduction to Electronic Literature in DH; Modelling the Textual Universe Through Mapping.

This course will question one of the most important practices in Digital Humanities, namely, the digital mapping of texts. Students will go through an extensive model-building experiment using the map exhibition tool Neatline. They will also create reports in the form of textual blogs and compare what can be expressed in each of the two media.
By comparing the different student projects in discussion sessions, we will look into what kinds of maps can be made from different types of texts, and the degree to which mapping is meaningful for different texts. Through the course, students will better understand where the information we put on maps comes from. To what degree is the information silently adjusted to fit the map medium? How much of what we express in text and as maps is steered by the medium?
Through this course students will not only learn how to make map exhibitions based on texts but will also explore how modelling in the form of media transformation can be used as a text analysis tool.

This is an intermediate-to-advanced course in stylometry, the quantitative study of writing style. While stylometry has usually been associated with authorship attribution, recent research shows that the same methods can be used in a much broader context of literary study.
The statistics of text features such as word, word n-gram, or letter n-gram frequencies, apart from being a highly precise tool for identifying authorship, can in fact reveal patterns of similarity and difference: between various works by the same author; between works by different authors; between authors differing in chronology, gender, genre, or narrative style; between translations of the same author or group of authors; and between dialogic voices in novels.
This in turn provides a new opening in literary studies, and the results of stylometry can be compared and confronted with the findings of traditional stylistics and interpretation. Participants will learn some of the more useful stylometric tools and methods, from simple wordlist-making to multivariate analyses of word and phrase frequencies to complex graphs and networks. The texts will be literary and multilingual, and will include both originals and translations.
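As a toy illustration of the underlying idea, here is a comparison of two tiny texts by the cosine similarity of their word-frequency profiles — a deliberately simplified stand-in for the multivariate methods the course covers, with invented example sentences.

```python
from collections import Counter
import math

def word_frequencies(text):
    """Relative frequencies of lowercase word tokens."""
    tokens = text.lower().split()
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def cosine_similarity(freq_a, freq_b):
    """Cosine similarity between two word-frequency profiles (0 to 1)."""
    shared = set(freq_a) | set(freq_b)
    dot = sum(freq_a.get(w, 0) * freq_b.get(w, 0) for w in shared)
    norm_a = math.sqrt(sum(v * v for v in freq_a.values()))
    norm_b = math.sqrt(sum(v * v for v in freq_b.values()))
    return dot / (norm_a * norm_b)

a = word_frequencies("the cat sat on the mat")
b = word_frequencies("the dog sat on the rug")
print(round(cosine_similarity(a, b), 3))  # 0.75
```

Real stylometric work computes such profiles over the most frequent words of whole corpora and feeds them into clustering or network analyses, but the distance measure is the same in spirit.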
This course will survey pertinent research in Open Access (OA) methods, theory, and implementation, and it will look forward to open social scholarship. Overall, we will consider the role of OA knowledge dissemination in academia and at large. Specific topics of discussion include advocacy, infrastructure, intellectual property rights, research evaluation metrics, online journals, databases, and the methods and limitations of peer review in this context.
Using OA as a foundation, we will discuss the rising trend and potential impact of open social scholarship, which involves the creation and dissemination of research and technologies to a broad, interdisciplinary audience of specialists and non-specialists. This course will be geared toward students, librarians, scholars, publishers, government representatives, and others who are invested in the open development and sharing of research output.
Consider this offering in complement with:

Digital games are often studied as texts, as objects of research. However, given that games can function as simulations, models, arguments, and creative collaboratories, game-based inquiry can also serve as a method of humanities research, communication, and pedagogy.
This course will explore the ways that simple game environments can be used as research, reporting and teaching tools that involve broad communities of players and publics in creative problem solving, open social scholarship, scholarly communication, and engaged and immersive learning.
Participants will be introduced to the affordances and constraints of multiple game types, including transmedia gaming, alternate reality games, vast narrative games, and serious games. We will explore existing examples; discuss realistic planning, development, and outcome logistics; and critically engage with the theoretical and practical implications of game-based scholarly engagement as participants work towards developing their own prototypes, which may or may not be exclusively digital.
The course will offer a forum for such a consideration. Questions to be considered include: What does queer studies bring to DH? How can we understand DH as already queer? What other relevant perspectives exist?
These questions can be grouped under two larger areas of inquiry. The first asks us to consider the ontology of DH: definitions, concepts, terms, tools, and methodologies.
The second centers on both the work that we do and how we do it: queering DH tools and methods, and working with queer folks, identities, and communities.
These two areas of inquiry will structure the course, providing participants with opportunities to discuss and debate readings and ideas, as well as engage in hands-on explorations of digital tools, programming, classification systems, protocols, and best practices for working with queer communities and artifacts.

XML has become a widely adopted format for data interchange, mainly due to its simplicity and adaptability for general use.
However, other programming languages offer alternatives to XSLT for this task, each with a different workflow. The scope of this workshop is not limited to XML alone. Because of the nature of this workshop, experience with programming is necessary.
This workshop is based on Python, so prior experience will definitely be an asset. Participants are also encouraged to bring their own projects to the workshop to use in place of the provided data sets.
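A sketch of the kind of XSLT-free workflow meant here: pulling fields out of an XML record set with Python and writing them to CSV. The element names and data are invented for the example.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Invented record set standing in for a participant's own data.
xml_data = """<poems>
  <poem year="1855"><title>Song of Myself</title></poem>
  <poem year="1862"><title>Goblin Market</title></poem>
</poems>"""

# Walk the tree and emit one CSV row per record.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["title", "year"])
for poem in ET.fromstring(xml_data).iter("poem"):
    writer.writerow([poem.findtext("title"), poem.get("year")])

print(out.getvalue())
```

Where an XSLT stylesheet declares the output pattern, this imperative style lets you mix in any other Python processing (cleaning, counting, joining) along the way.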
Electronic literature is described as born-digital literary work, that is, literature produced with and only experienced on a computing device. This course combines seminar and workshop methodologies so that participants gain the background needed to critique, interpret, and teach electronic literature with knowledge of its production.
This course uses an anti-colonial framework to analyze the ethics surrounding physical and digital surveillance methods, including the use of algorithms, biometrics, social media, and physical data.
We will examine the ways in which communities experience surveillance differently, based on factors such as race, ethnicity, gender, sexuality, and socioeconomic status.

This course will introduce you to many techniques available to process, analyze, and visualize textual data with Python. You will be introduced to the theory and method of a discipline within computer science called Natural Language Processing (NLP). We will discover why Python is an excellent language for text analysis and why Python 3 has some advantages over its predecessor, Python 2.
The NLTK is a large library of tools and resources that will allow us to conduct part-of-speech tagging, sentiment analysis, entity recognition, and text classification. Experience with Python is not strictly required for participation in the class, but a general understanding of programming methods and terms will be an asset.
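As a toy illustration of lexicon-based sentiment analysis — one of the tasks NLTK automates — here is a simplified pure-Python version; the word lists are invented for the example, and real lexicons are far larger.

```python
# Invented toy lexicons; NLTK ships much richer resources for this.
POSITIVE = {"wonderful", "elegant", "moving"}
NEGATIVE = {"tedious", "dull", "clumsy"}

def sentiment_score(text):
    """Positive minus negative lexicon hits: a crude sentiment signal."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

print(sentiment_score("A wonderful, moving novel."))    # 2
print(sentiment_score("Tedious and dull throughout."))  # -2
```

The course's NLTK-based approaches replace these hand-made lists and the naive tokenizer with trained models and proper tokenization, but the shape of the task is the same.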
Generally speaking, this class will help you think about humanities problems through computation. More specifically, you will understand the kinds of questions we can answer with NLP techniques and methods.

This course will be relevant to anyone who wants to get a LAMP (Linux, Apache, MySQL, PHP) environment up and running for either testing or project deployment, or, perhaps, to those who want to speed up the implementation of projects that have been held back for institutional or financial reasons.
This course requires no prior knowledge, just a willingness to try and occasionally fail at new things. While some familiarity with the Linux environment will help with more advanced topics, no prior experience is needed. We will quickly review how to use the command line, as well as the fundamental principles of file and user permissions. As you might expect, the course will involve a significant hands-on technical portion.
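A first permissions exercise of the kind such a review might include, sketched for a GNU/Linux shell (`stat -c` assumes GNU coreutils; the filename is invented):

```shell
# Create a file, restrict it to the owner, and inspect the mode bits.
touch notes.txt
chmod 600 notes.txt        # owner read/write only; no group/other access
stat -c '%a' notes.txt     # prints the octal mode: 600
```

Restricting configuration files this way (e.g., database credentials in a LAMP stack) is one of the habits the permissions review builds toward.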
Other topics will also be covered.

Traditional scholarly approaches will still have their place among these new media objects, but they will frequently need to be used in conjunction with methods for handling large volumes of new media. This course starts from a basic introduction to media types and their potential research value, then leads participants through the hands-on process of building a pipeline for each type, from collecting the material through to processing it and, finally, storing it.
Exact sources of the media to be used are still being considered, but still images, sound files, and video will all feature prominently. No previous experience working with media files of any type is required, but it would certainly be an asset.
Big Data is the current Big Thing, but this does not mean that it is well understood or straightforward to work with. This state of affairs leaves researchers with at least three questions: Where can they get Big Data?
How should they process Big Data? Where can Big Data processing be done? This course addresses these questions while giving participants hands-on experience with actual datasets exhibiting volume, variety, veracity, velocity, and so on. It is intended for researchers who are looking to handle datasets that they can no longer comfortably process on a desktop computer, typically because the data no longer works well with standard tools or data storage formats.
To get the most out of this course, participants should have a general familiarity with Unix-style command line interfaces and have a project or two, either in mind or at hand, that they suspect could benefit from what this course provides.
Emphasis is split between identifying tools and methods to handle such data (based on the properties of that data, the skills of the research team, and the systems available to handle it) and hands-on work that will generate understanding and impart relevant skills. Participants will be exposed to the creation of big data sets through web scraping and data aggregation; data formats for handling the results; processing techniques such as parallel processing, Hadoop, HBase, and Spark that can be run on Compute Canada systems; and output formats for visualization.
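As a minimal illustration of the parallel-processing idea behind frameworks like Spark, here is a sketch that maps a function over data across worker processes using only Python's standard library; the data and the chunk function are invented for the example.

```python
from multiprocessing import Pool

def count_words(line):
    """Per-chunk work: count word tokens in one line."""
    return len(line.split())

lines = ["one two three", "four five", "six"]

if __name__ == "__main__":
    # Distribute the per-line work across two worker processes,
    # then combine the partial results -- a tiny map/reduce.
    with Pool(2) as pool:
        counts = pool.map(count_words, lines)
    print(sum(counts))  # 6
```

Spark and Hadoop apply the same map-then-combine pattern across machines rather than processes, which is what makes cluster systems like Compute Canada's relevant.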
Valuable for those considering advanced graduate or doctoral studies leading to a career in higher education. Students writing a thesis must demonstrate a good understanding of research methods and the ability to apply those methods to a research project. All MFA or M.A. theses and doctoral dissertations are designated COM.

In this exploding age of information, it is the objective of the library faculty to prepare graduates to be on the cutting edge of information technology. Information literacy is the ability to effectively access information for problem solving and decision-making; thus, the knowledge and abilities you glean from this course will open doors to lifelong learning. It is imperative for graduate research. Since the information learned in this course is a vital foundation for all other coursework, its completion is required within the first semester of study.
The course takes approximately ten hours to complete.