FAQ

Why are prompts in English?

AiShort was created to make it easier for non-native English speakers to use ChatGPT, yet all of its prompts are in English. This is because ChatGPT understands English considerably better than other languages. Even MOSS, China's first conversational large language model (no longer publicly available), acknowledged that its English responses were superior to its Chinese ones. For this reason, English prompts are recommended.

While you might get decent results with prompts in other languages, entering the same non-English prompt twice can produce very different outcomes. Because ChatGPT's comprehension of non-English languages is less consistent, it is advisable to use English prompts for productivity-related tasks to ensure the quality of the output. Note that responses to English prompts are also likely to be in English; you can specify the response language by adding "respond in Chinese" at the end of your prompt, replacing "Chinese" with your mother tongue if it differs.

Do I have to input the prompt every time?

When using the API, you can set the prompt as a "system prompt," so you won't need to re-enter it in subsequent interactions; ChatGPT will keep following the instructions set there.
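For example, with the official OpenAI SDK the prompt goes into a message with the system role, and only the user's follow-up messages change between calls. This is a minimal sketch, not AiShort's own configuration; the model name and prompt text are placeholders:

```typescript
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function ask(userInput: string) {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // placeholder model name
    messages: [
      // The system prompt steers every reply and does not need to be retyped by the user.
      { role: "system", content: "Act as an English translator and improver." },
      // Only the user content changes between interactions.
      { role: "user", content: userInput },
    ],
  });
  return completion.choices[0].message.content;
}

ask("Please improve this sentence: he go to school yesterday.").then(console.log);
```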

In the web version of ChatGPT, as long as you haven't switched to another main prompt, you only need to enclose your subsequent content in quotation marks instead of re-entering the prompt each time. If a response stops following the prompt's requirements, ChatGPT has likely forgotten the prompt, and you'll need to re-enter it to reactivate it. Additionally, each chat has a unique link, so you can bookmark frequently used conversations for future use.

The Difference Between GPTs and Prompts

The GPTs feature allows users to interact with the model through customized text instructions. The purpose of these instructions is to guide the model toward outputs that closely match the user's needs, so the instructions entered into a GPT are functionally equivalent to prompts.

From a practical standpoint, a GPT's instructions essentially act as a built-in system prompt: they not only direct the model to produce specific responses but also reflect the user's control and guidance over the model's expected behavior. In fact, many prompts in the community originate from the built-in instructions of GPTs.

Search Input Delay

The web version of AiShort achieves zero-delay search because searching is triggered manually, whereas the Docker and extension versions search automatically with an 800-millisecond delay.

This stems from the extension's search functionality being based on Docusaurus's showcase, which loses focus when a PC input method editor is used. After I reported this to Docusaurus, the maintainers said they would try to fix it and noted that, since the showcase is not localized, Chinese input wasn't really expected there; however, the problem was never resolved. I therefore split the search component into two cases: mobile and PC. The search logic for mobile remains unchanged, while for PC browsing (screen widths above a 768px threshold) a debounce function was introduced to work around the input-method issue. This creates two drawbacks on PCs: first, non-English input must be completed within 800 milliseconds; second, the search refresh changes from instant to an 800-millisecond delay. If you have a better solution, please let us know via Feedback.
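To illustrate the approach (a sketch, not the actual AiShort implementation), a debounce wrapper delays the search callback until the user has paused typing, and is applied only on desktop-width screens. The 768px breakpoint and 800 ms delay below match the figures mentioned above; the function and variable names are made up for this example:

```typescript
// Generic debounce helper: postpones `fn` until `delayMs` has passed
// without another call.
function debounce<T extends (...args: any[]) => void>(fn: T, delayMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

declare function runSearch(query: string): void; // filters the prompt list

const DEBOUNCE_MS = 800;
// Treat widths above 768px as "PC"; narrower screens keep instant search.
const isDesktop = window.matchMedia("(min-width: 768px)").matches;

const handleInput = isDesktop ? debounce(runSearch, DEBOUNCE_MS) : runSearch;

// Usage, e.g.:
// searchBox.addEventListener("input", e =>
//   handleInput((e.target as HTMLInputElement).value));
```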

Outputting False Information

Although ChatGPT is very powerful, it is not infallible and sometimes outputs false information. For example, when I needed to enter hundreds of entries into AiShort, I asked ChatGPT to convert the data into a specified format. During the process, I discovered that some of the information had been rewritten incorrectly: a tag that read "movie critic" in the source text was changed to "film critic". In prose this might not matter much, but in code it would cause an error. It is therefore crucial to check ChatGPT's output when using it.

How can I back up my prompts?

Logged-in users can use the export feature to back up the prompts they have created:

  1. Click the "Export Prompts" button in the upper right corner of the page.
  2. The system will automatically generate a JSON file containing all your custom prompts.
  3. The file will download automatically and contains the complete information and creation time of your prompts (one possible shape is sketched below).
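The exact schema of the exported file isn't documented here, but a backup of this kind typically looks something like the following; all field names are hypothetical and may differ from what AiShort actually writes:

```typescript
// Hypothetical structure of the exported JSON backup (field names assumed).
interface ExportedPrompt {
  title: string;       // name shown in the prompt list
  description: string; // the prompt text itself
  remark?: string;     // optional notes
  createdAt: string;   // creation time, e.g. "2024-05-01T12:00:00Z"
}

type ExportFile = ExportedPrompt[];
```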

What if a prompt isn't working well?

If you are summarizing content, you can ask ChatGPT to further refine its original answer to improve accuracy. Beyond productivity, prompts are also valuable for stimulating your thinking, helping you approach problems from multiple angles and notice issues that are easily overlooked.

All prompts on AiShort are sourced from the internet, and the prompt library is updated regularly. Although each prompt has undergone multiple rounds of testing, its actual effectiveness may vary depending on your needs. If you find errors, have innovative ideas, or come across useful prompts, please let us know via Feedback or share them in the community.