The description of the process sounds perfect for my application, so I am testing out the paid version.
I uploaded a roughly 1,000-page .txt file to Summarize.wtf. Processing took about a minute, but the tool did not return a summary of the file's content.
About the session (all I could find):
Your Usage this Month
1/200 Questions
3.15 MB/1600 MB Uploaded
1101/10000 Pages
Plus Tier Subscription
TOTAL TOKENS USED: 13031
The return from the interface:
The input text consists of various chunks, each containing information about different aspects of a document or article. Here is a breakdown of the key points from each chunk:
Chunk ID: 5181699561357312-0
The chunk contains formatting information for the document.
Chunk ID: 5181699561357312-1
Provides information about the item type and note associated with it.
Chunk ID: 5181699561357312-2
Indicates the date added and modified for the item.
Chunk ID: 5181699561357312-3
Provides additional information about the modification date.
Chunk ID: 5181699561357312-4
Contains annotations and quotes related to public health policy.
Chunk ID: 5181699561357312-12
Discusses the need for more research on interventions related to public health policy in schools.
Chunk ID: 5181699561357312-56
Talks about the concept of placekeeping and its role in creating equitable local food systems.
Chunk ID: 5181699561357312-57
Provides information about the item type, which is a blog post.
Chunk ID: 5181699561357312-58
Gives the author's name for the blog post.
Chunk ID: 5181699561357312-60
Provides information about the language in which the blog post is written.
Chunk ID: 5181699561357312-63
Indicates the date added and modified for the blog post.
Chunk ID: 5181699561357312-64
Provides additional information about the modification date.
Chunk ID: 5181699561357312-72
Discusses the school food environment and its impact on children's dietary behaviors.
Chunk ID: 5181699561357312-74
Presents an abstract for a journal article related to community food security through gardening.
Chunk ID: 5181699561357312-75
Talks about the author's name for the journal article.
Chunk ID: 5181699561357312-78
Provides a brief overview of a conference, including the number of attendees and papers presented.
Chunk ID: 5181699561357312-110
Indicates the item type as a journal article.
Chunk ID: 5181699561357312-113
Provides the author's name for the journal article.
Chunk ID: 5181699561357312-132
Gives a summary of the abstract related to nutrition interventions in the Philippines.
Chunk ID: 5181699561357312-147
Provides information about the number of abstracts submitted and presentations at a conference.
Chunk ID: 5181699561357312-160
Lists the tags associated with the item.
Chunk ID: 5181699561357312-175
Gives the authors' names for a journal article on addressing community food security through gardening.
Chunk ID: 5181699561357312-110
Provides formatting information for the document.
These chunks provide details about the type of item, authorship, dates, abstracts, and tags associated with the items. Overall, the input text provides a comprehensive understanding of the different aspects covered in the document or article
I tried a second time with an unformatted .txt file and got a more detailed return.
You may need a check on the document's formatting, or a reformatting step, so that the 'chunks' themselves are not the focus of the summarization.
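As a rough illustration of what such a reformatting step might look like: the chunk descriptions above suggest the file is a bibliographic export full of metadata fields (item type, dates added/modified, language, tags). A preprocessing pass could drop those field lines so the summarizer only sees the prose (abstracts, notes, annotations). This is just a sketch; the exact field labels here are guesses based on the output above, not the actual format of my file.

```python
import re

# Metadata field labels to strip before summarizing. These specific labels
# are an assumption inferred from the chunk descriptions (item type, dates,
# language, tags); adjust them to match the real export format.
METADATA_PREFIXES = (
    "Item Type:", "Author:", "Date Added:", "Date Modified:",
    "Language:", "Tags:",
)

def strip_metadata(text: str) -> str:
    """Drop metadata-only lines so the summarizer focuses on prose."""
    kept = []
    for line in text.splitlines():
        stripped = line.strip()
        if any(stripped.startswith(p) for p in METADATA_PREFIXES):
            continue  # skip bibliographic field lines
        kept.append(line)
    # Collapse the runs of blank lines left behind by removed fields.
    return re.sub(r"\n{3,}", "\n\n", "\n".join(kept))
```

Running something like this over the file before upload (or as a built-in step on your end) might keep the summary about the content rather than the record structure.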