AI Guidance for Communicators

See our best practices and guidelines for NC State communications and marketing professionals using artificial intelligence tools.

Professional marketers and communicators are increasingly using artificial intelligence (AI) to help them do their jobs. As society adapts to the changes caused by AI, UComm is providing AI usage guidance for individuals with communications and marketing responsibilities at NC State.

Follow these overarching guidelines to ensure AI is a help rather than a hindrance:

  1. Always keep a human involved. AI tools excel at helping with the early and middle portions of a project, such as brainstorming ideas, organizing and summarizing meeting notes, and analyzing analytics data. If you use generative AI to create content consumed by constituents, make sure it is thoroughly reviewed and edited by a human.
  2. Remember that AI can’t think. It simply looks for statistical probabilities of what content usually appears in relation to other content. Unlike spreadsheets and other conventional programs that produce deterministic results, generative AI tools deal in probabilities, not certainties.
  3. The final result is your responsibility, not the AI’s. AI can be immensely helpful, but you are ultimately responsible for how its output is used.

Overview

Background and Context

The term AI refers to software that exhibits abilities normally associated with human intelligence, such as understanding natural language, recognizing patterns, making decisions and solving problems. However, AI tools cannot mirror the complexities of human reasoning. Generative AI tools do not create original work; rather, they make predictions based on vast amounts of existing content. Ultimately, AI cannot replace human creativity.

UComm is providing these guidelines for use by communications and marketing professionals at NC State University. They are not intended to govern other areas of the university, such as education and classroom settings, IT, chatbots, etc. They apply to AI tools that generate content, such as images, text, music, video and other similar items.

UComm AI Guiding Principles

  • We believe in a human-centered approach to AI that empowers and augments professionals. AI technologies are tools. They cannot replace thoughtful human decision-making, and they should be treated as assistive — not autonomous — technologies.
  • We believe that humans remain accountable for all decisions and actions that involve AI. All AI-generated material must be carefully overseen, reviewed, edited and approved by a human author, editor or designer.
  • We believe in the critical role of human knowledge, experience, emotion and imagination in creativity, and we seek to explore and promote emerging career paths and opportunities for creative professionals in all fields.
  • We believe in the power of communication to educate, influence and effect change. We commit to never knowingly using generative AI technology to deceive or spread misinformation.
  • We commit to verifying the accuracy of information supplied by AI. Nothing can replace the role of human fact-checkers, and we take responsibility for any AI-assisted information used in communications materials.
  • Generative AI models are likely to have been trained on other people’s copyrighted or trademarked material. Therefore, we will take great care to ensure that the final product of any AI-generated material has been carefully reviewed and, where necessary, modified to avoid copyright or trademark infringement.
  • We believe that transparency in AI usage is essential to maintaining the trust of our audiences and stakeholders.
  • We believe in the importance of upskilling and reskilling professionals and of using AI to increase productivity and efficiency while also building more fulfilling careers and lives.
  • We believe in partnering with organizations and people who share our principles.

Building AI Into Your Communications and Marketing Toolkit

AI can help with many different job duties. The suggestions below are not exhaustive, due to the constantly evolving nature of available AI tools and systems.

It is important to remember that AI tools are assistive, not autonomous. They are writing, brainstorming and content aids. They cannot replace the role and importance of the human in these tasks.

Writing and Content Creation

  • Brainstorming new story ideas: AI can help you generate fresh story ideas, and it can offer a different perspective or provide constructive feedback on existing concepts for content.
  • Creating an outline: AI can help organize content ideas into a cohesive structure.
  • Building an editorial calendar or content plan: AI can help you quickly organize and plan your content and social media calendars.
  • Helping with headers, headlines and other content structure and navigation: AI tools can help you identify common themes and provide draft ideas for headlines, subheads, website headers, H3 tags, etc.
  • Anticipating potential questions or objections: If you’re making a pitch or proposal, ask an AI tool to behave like an investigative journalist and suggest potential questions or objections from stakeholders so you can prepare responses in advance.
  • Assisting as an editor: AI tools can answer questions about editorial style. However, remember that some tools may not have access to the most recent version of regularly updated style guides and likely do not have access to NC State’s editorial guidelines.
  • Serving as a thesaurus: AI tools can help you identify alternatives to a given word, phrase or section of content.

Website and Digital Channel Management

  • Search engine optimization (SEO): AI tools in the marketing and communications realm can quickly assist with keyword research and help analyze factors like readability, keyword usage and relevancy to improve webpage quality and performance.
  • Helping draft social media posts: AI tools can be a great place to start for a quick first draft of social media posts. They can also help you tailor existing social media posts, comments, etc., to different audiences and drive engagement.
  • Personalizing messaging: AI tools can be adept at helping you rework your content to reach different audiences, such as students, staff, faculty, donors or the media. They can make suggestions for how to change language, shorten text or emphasize different targeted messages.
  • Repurposing content: Paste content that’s too long into an AI tool and ask it to identify areas you could cut. It will look for repetitions or places where shorter phrases would suffice. Note that humans should still review these suggestions, especially because the tool may suggest changes to quotes or adjust factual information.

Imagery, Video and Other Visual Content

  • Don’t misrepresent: Just as with non-AI content, make sure you do not misrepresent NC State and its work. This includes, but is not limited to, making sure you don’t generate images of individuals, buildings or campus locations that don’t exist, and that you don’t substantially alter the appearance of real-world people or places.
  • Follow brand guidelines for visual identity: The visual identity of NC State’s brand is rooted in flat, mid-century modern motifs, and AI-generated content should follow the same visual identity guidelines as any other university communication material.
  • Use built-in tools: Photo editing applications increasingly have AI-powered tools that allow for easy image touch-ups. This is a great and encouraged use of AI technology, but these tools should not be used to give a false impression of NC State or our people.
  • Be careful with protected logos and marks: In most instances you should not generate wolf imagery, alternate logos or other visual marks that officially signify the university. Not only can this result in trademark issues, it can also cause confusion and dilute the NC State brand.
  • Ask: If you aren’t sure whether generated visuals might run afoul of university guidelines and policies, ask the UComm AI working group (ucomm-ai-group@ncsu.edu) to provide more context.

General Productivity

  • Enhancing productivity: If you follow NC State’s privacy and data security policies, AI tools can help you with routine tasks such as transcribing interviews, analyzing data or drafting outlines and text for presentations or publication. However, it is important to evaluate the resulting output to ensure it meets professional standards for accuracy and quality and will help achieve unit and institutional strategic goals without creating reputational risk. For example, AI may be able to help you draft emails, but you should always review and edit AI-generated copy before disseminating it to your intended audience. In short, it is vital to treat AI as an assistant that helps you do your job, not a robot that does your job for you. For a cautionary example of AI misuse, see this news story about Vanderbilt University using AI to write an all-campus email after a mass shooting.
  • Meeting summaries: NC State Zoom accounts allow you to auto-generate meeting summaries and notes. These can be helpful for recalling discussions and understanding the next steps that were assigned. This functionality can be enabled within an individual’s Zoom account.
  • Google Workspace business productivity tool: Google Gemini can be used directly within Google Docs, Sheets, Slides, and Drive to assist with and streamline common daily tasks.

Responsible AI Use

All AI content that is used in an end product (e.g., an email to stakeholders, an annual report, a news story, etc.) should be thoroughly reviewed and edited by a human. Using AI to generate user-facing content outright is almost always discouraged. Instead, AI should be leveraged as a tool to assist during the content planning process. Examples of how to use AI in this manner are available within the “AI Uses to Consider” section.

The items below enumerate more specific scenarios to consider:

  • University policies: AI tools should not be used in any way that would violate existing university standards or policies, such as communicating falsehoods (knowingly or otherwise), spamming/phishing or manipulating data to create a false impression.
  • University data management requirements: In general, any AI tool that you submit information to in the course of using it — such as asking a question for research or providing a prompt for output — might retain that information and use it to help train the tool. That means you should always be very careful about entering information into any AI tool, because the tool will likely not guarantee the security or privacy of that data. NC State’s Office of Information Technology has approved a list of AI tools that can be used with green (not sensitive) and yellow (moderately sensitive) data as defined by OIT’s Data Management Framework. When you use these approved AI tools, your data is protected and is not used to train public models. You must use your NC State account with these AI tools; personal accounts may not be used with university data.
  • Fact-checking: AI tools are outstanding research assistants, but they may “hallucinate” and suggest facts and sources that sound plausible but are partially or entirely inaccurate. Humans should verify all AI-provided facts, and human review must be an integral component of all research, planning and content creation.
  • Brand compliance: Like all content created on behalf of NC State, any content created with the help of AI must comply with all relevant university policies, rules and regulations, such as RUL 01.25.01 – NC State Brand Use and Protection. Content created with the help of AI must also adhere to NC State’s brand guidelines governing such matters as fonts, colors, design conventions, logo usage, editorial style, and voice and tone.
  • Imagery, video and audio creation: AI-generated images, music, audio and video should not be used in university communications materials. The legality of this practice is under review in the courts, and the ethics are dubious. Instead, AI can be used to help brainstorm art ideas and direction. Some artists are pursuing legal recourse against organizations using AI-generated art rather than against the AI companies themselves.
  • Public records: Assume any data entered into an AI tool could be made public through a public records request.

Contact Us

If you have any questions or would like to partner with UComm on any AI-related projects or initiatives, reach out to UComm’s AI working group (ucomm-ai-group@ncsu.edu).

Alastair Hadden

Acting Executive Director of Marketing

Scott Thompson

Director of Web Services

Brent Winter

Director of Editorial Services