Notes
Notes - notes.io |
alpesh
In [54]: import os
from langchain.chains import RetrievalQA
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma
from pdf2image import convert_from_path
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from langchain.document_loaders import PyPDFDirectoryLoader
from langchain.chains.question_answering import load_qa_chain
from langchain.document_loaders import DirectoryLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.prompts import PromptTemplate
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import Chroma
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import HuggingFaceEmbeddings
OPENAI_API_KEY = "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"
os.environ["OPENAI_API_KEY"] = OPENAI_API_KEY
In [55]: chat_gpt = ChatOpenAI(temperature=0.7, model_name="gpt-4")
In [56]: embeddings = HuggingFaceEmbeddings(model_name="embaas/sentence-transforme
In [57]: loader = PyPDFLoader("alpesh/Moving + Packing +Unpacking.pdf")
In [58]: documents = loader.load_and_split()
In [59]: text_splitter = RecursiveCharacterTextSplitter(chunk_size=1024, chunk_ove
texts = text_splitter.split_documents(documents)
In [60]: loader1 = PyPDFLoader("alpesh/Moving + Packing +Unpacking.pdf")
documents1 = loader1.load_and_split()
text_splitter1 = RecursiveCharacterTextSplitter(chunk_size=1024, chunk_ov
articles = text_splitter1.split_documents(documents1)
In [61]: db = Chroma.from_documents(texts, embeddings)
In [ ]:
In [62]: template = """
### Instruction: Act as a H2H movers consultant.I provide you article whe
your task is ask question for gathering information from user as based fo
<article>{texts}</article>.
Please follow the instruction before questioning to the user.
1. Ask at least 10 questions.
2. your question based only above article. do not asked question outside
3. Ask question as like a friendly consultant.
``` After gathering information your task is share information like billi
1. Keep remember your giving information based on gathering information f
2. If any suggestion or another option asked to user to alternate option
^^^
I'll give an example as following.
localhost:8888/lab/tree/alpesh.ipynb
1/1011/11/2023, 19:18
alpesh
### Input : What day were you looking to move?
### Answer : sunday 12 nov 2023
### Input : And what is the size of your current home, is it 1, 2, 3 bedr
### Answer : 3
### Input : Are you going to move by yourself or with somebody else?
### Answer : somebody else
### Input : Do you have anything over 200 lbs that can not be disassemble
specific item and weight, if they don't know ask if it is over 300 lbs)
### Answer : yes
### Input : Is there anything else that you think would be important for
your move? (if no the move on cost or busget information)
### Answer :yes, I have an dj set.
[
Step 1. Make a report like, money, arrival time,etc
Step 2. PAINT a picture like I can estimate you will need (xxx) small, (
(xxx) large boxes. So the total together including the packing paper will
When it comes to moving, I would recommend ___# men and a truck. The hour
team is $____.
STEP 3 - GIVE THE RATES
STEP 4 - GIVE THE QUOTES
]
### Input : Have you heard enough to make a decision?
### Answer : If yes - OK, SO I've got you down for –(DATE) —-.
IF NO...
STEP 5 rebuttals (from the texts article)
^^^
Keep remember if user didnot satify with your response. ask them for refr
{chat_history}
### Input: {question}
### Response:
""".strip()
In [63]: prompt = PromptTemplate(
input_variables=["texts", "question", "chat_history"], template=templ
)
In [64]: memory = ConversationBufferMemory(
memory_key="chat_history",
human_prefix="### Input",
ai_prefix="### Response",
output_key="answer",
return_messages=True,
)
In [67]: chain = ConversationalRetrievalChain.from_llm(
chat_gpt,
retriever=db.as_retriever(),
memory=memory,
localhost:8888/lab/tree/alpesh.ipynb
2/1011/11/2023, 19:18
alpesh
combine_docs_chain_kwargs={"prompt": prompt},
return_source_documents=True,
verbose=True,
)
localhost:8888/lab/tree/alpesh.ipynb
3/1011/11/2023, 19:18
alpesh
--------------------------------------------------------------------------
-
ValidationError
Traceback (most recent call las
t)
Cell In[67], line 1
----> 1 chain = ConversationalRetrievalChain.from_llm(
2
chat_gpt,
3
retriever=db.as_retriever(),
4
memory=memory,
5
combine_docs_chain_kwargs={"prompt": prompt},
6
return_source_documents=True,
7
verbose=True,
8 )
File ~/.local/lib/python3.8/site-packages/langchain/chains/conversational_
retrieval/base.py:341, in ConversationalRetrievalChain.from_llm(cls, llm,
retriever, condense_question_prompt, chain_type, verbose, condense_questio
n_llm, combine_docs_chain_kwargs, callbacks, **kwargs)
317 """Convenience method to load chain from LLM and retriever.
318
319 This provides some logic to create the `question_generator` chain
(...)
338
ConversationalRetrievalChain
339 """
340 combine_docs_chain_kwargs = combine_docs_chain_kwargs or {}
--> 341 doc_chain = load_qa_chain(
342
llm,
343
chain_type=chain_type,
344
verbose=verbose,
345
callbacks=callbacks,
346
**combine_docs_chain_kwargs,
347 )
349 _llm = condense_question_llm or llm
350 condense_question_chain = LLMChain(
351
llm=_llm,
352
prompt=condense_question_prompt,
353
verbose=verbose,
354
callbacks=callbacks,
355 )
File ~/.local/lib/python3.8/site-packages/langchain/chains/question_answer
ing/__init__.py:249, in load_qa_chain(llm, chain_type, verbose, callback_m
anager, **kwargs)
244 if chain_type not in loader_mapping:
245
raise ValueError(
246
f"Got unsupported chain type: {chain_type}. "
247
f"Should be one of {loader_mapping.keys()}"
248
)
--> 249 return loader_mapping[chain_type](
250
llm, verbose=verbose, callback_manager=callback_manager, **kwa
rgs
251 )
File ~/.local/lib/python3.8/site-packages/langchain/chains/question_answer
ing/__init__.py:81, in _load_stuff_chain(llm, prompt, document_variable_na
me, verbose, callback_manager, callbacks, **kwargs)
73 llm_chain = LLMChain(
74
llm=llm,
75
prompt=_prompt,
(...)
localhost:8888/lab/tree/alpesh.ipynb
4/1011/11/2023, 19:18
alpesh
78
79
80
---> 81
82
83
84
85
86
87
88
callbacks=callbacks,
)
# TODO: document prompt
return StuffDocumentsChain(
llm_chain=llm_chain,
document_variable_name=document_variable_name,
verbose=verbose,
callback_manager=callback_manager,
callbacks=callbacks,
**kwargs,
)
File ~/.local/lib/python3.8/site-packages/langchain/load/serializable.py:7
4, in Serializable.__init__(self, **kwargs)
73 def __init__(self, **kwargs: Any) -> None:
---> 74
super().__init__(**kwargs)
75
self._lc_kwargs = kwargs
File ~/.local/lib/python3.8/site-packages/pydantic/main.py:341, in pydanti
c.main.BaseModel.__init__()
ValidationError: 1 validation error for StuffDocumentsChain
__root__
document_variable_name context was not found in llm_chain input_variable
s: ['texts', 'question', 'chat_history'] (type=value_error)
In [66]: question = "hi"
answer = chain(question)
localhost:8888/lab/tree/alpesh.ipynb
5/1011/11/2023, 19:18
alpesh
--------------------------------------------------------------------------
-
ValueError
Traceback (most recent call las
t)
Cell In[66], line 2
1 question = "hi"
----> 2 answer = chain(question)
File ~/.local/lib/python3.8/site-packages/langchain/chains/base.py:292, in
Chain.__call__(self, inputs, return_only_outputs, callbacks, tags, metadat
a, run_name, include_run_info)
290 except (KeyboardInterrupt, Exception) as e:
291
run_manager.on_chain_error(e)
--> 292
raise e
293 run_manager.on_chain_end(outputs)
294 final_outputs: Dict[str, Any] = self.prep_outputs(
295
inputs, outputs, return_only_outputs
296 )
File ~/.local/lib/python3.8/site-packages/langchain/chains/base.py:286, in
Chain.__call__(self, inputs, return_only_outputs, callbacks, tags, metadat
a, run_name, include_run_info)
279 run_manager = callback_manager.on_chain_start(
280
dumpd(self),
281
inputs,
282
name=run_name,
283 )
284 try:
285
outputs = (
--> 286
self._call(inputs, run_manager=run_manager)
287
if new_arg_supported
288
else self._call(inputs)
289
)
290 except (KeyboardInterrupt, Exception) as e:
291
run_manager.on_chain_error(e)
File ~/.local/lib/python3.8/site-packages/langchain/chains/conversational_
retrieval/base.py:141, in BaseConversationalRetrievalChain._call(self, inp
uts, run_manager)
139
new_inputs["question"] = new_question
140 new_inputs["chat_history"] = chat_history_str
--> 141 answer = self.combine_docs_chain.run(
142
input_documents=docs, callbacks=_run_manager.get_child(), **ne
w_inputs
143 )
144 output: Dict[str, Any] = {self.output_key: answer}
145 if self.return_source_documents:
File ~/.local/lib/python3.8/site-packages/langchain/chains/base.py:492, in
Chain.run(self, callbacks, tags, metadata, *args, **kwargs)
487
return self(args[0], callbacks=callbacks, tags=tags, metadata=
metadata)[
488
_output_key
489
]
491 if kwargs and not args:
--> 492
return self(kwargs, callbacks=callbacks, tags=tags, metadata=m
etadata)[
493
_output_key
494
]
496 if not kwargs and not args:
localhost:8888/lab/tree/alpesh.ipynb
6/1011/11/2023, 19:18
alpesh
497
raise ValueError(
498
"`run` supported with either positional arguments or keywo
rd arguments,"
499
" but none were provided."
500
)
File ~/.local/lib/python3.8/site-packages/langchain/chains/base.py:268, in
Chain.__call__(self, inputs, return_only_outputs, callbacks, tags, metadat
a, run_name, include_run_info)
232 def __call__(
233
self,
234
inputs: Union[Dict[str, Any], Any],
(...)
241
include_run_info: bool = False,
242 ) -> Dict[str, Any]:
243
"""Execute the chain.
244
245
Args:
(...)
266
`Chain.output_keys`.
267
"""
--> 268
inputs = self.prep_inputs(inputs)
269
callback_manager = CallbackManager.configure(
270
callbacks,
271
self.callbacks,
(...)
276
self.metadata,
277
)
278
new_arg_supported = inspect.signature(self._call).parameters.g
et("run_manager")
File ~/.local/lib/python3.8/site-packages/langchain/chains/base.py:425, in
Chain.prep_inputs(self, inputs)
423
external_context = self.memory.load_memory_variables(inputs)
424
inputs = dict(inputs, **external_context)
--> 425 self._validate_inputs(inputs)
426 return inputs
File ~/.local/lib/python3.8/site-packages/langchain/chains/base.py:179, in
Chain._validate_inputs(self, inputs)
177 missing_keys = set(self.input_keys).difference(inputs)
178 if missing_keys:
--> 179
raise ValueError(f"Missing some input keys: {missing_keys}")
ValueError: Missing some input keys: {'texts'}
In [31]: print(answer['answer'])
--------------------------------------------------------------------------
-
NameError
Traceback (most recent call las
t)
Cell In[31], line 1
----> 1 print(answer['answer'])
NameError: name 'answer' is not defined
In [ ]:
localhost:8888/lab/tree/alpesh.ipynb
7/1011/11/2023, 19:18
alpesh
In [ ]:
In [ ]:
In [ ]:
In [ ]:
In [ ]:
In [ ]:
In [ ]:
In [ ]:
In [68]: loader = PyPDFDirectoryLoader("alpesh")
docs = loader.load()
len(docs)
Out[68]: 103
In [75]: text_splitter = RecursiveCharacterTextSplitter(chunk_size=1024, chunk_ove
context = text_splitter.split_documents(docs)
len(context)
Out[75]: 187
In [76]: template = """
### Instruction: Act as a H2H movers consultant.I provide you article whe
your task is ask question for gathering information from user as based fo
<article>{context}</article>.
Please follow the instruction before questioning to the user.
1. Ask at least 10 questions.
2. your question based only above article. do not asked question outside
3. Ask question as like a friendly consultant.
``` After gathering information your task is share information like billi
1. Keep remember your giving information based on gathering information f
2. If any suggestion or another option asked to user to alternate option
^^^
I'll give an example as following.
### Input : What day were you looking to move?
### Answer : sunday 12 nov 2023
### Input : And what is the size of your current home, is it 1, 2, 3 bedr
### Answer : 3
### Input : Are you going to move by yourself or with somebody else?
### Answer : somebody else
### Input : Do you have anything over 200 lbs that can not be disassemble
specific item and weight, if they don't know ask if it is over 300 lbs)
localhost:8888/lab/tree/alpesh.ipynb
8/1011/11/2023, 19:18
alpesh
### Answer : yes
### Input : Is there anything else that you think would be important for
your move? (if no the move on cost or busget information)
### Answer :yes, I have an dj set.
[
Step 1. Make a report like, money, arrival time,etc
Step 2. PAINT a picture like I can estimate you will need (xxx) small, (
(xxx) large boxes. So the total together including the packing paper will
When it comes to moving, I would recommend ___# men and a truck. The hour
team is $____.
STEP 3 - GIVE THE RATES
STEP 4 - GIVE THE QUOTES
]
### Input : Have you heard enough to make a decision?
### Answer : If yes - OK, SO I've got you down for –(DATE) —-.
IF NO...
STEP 5 rebuttals (from the texts article)
^^^
### Input: {question}
### Response:
""".strip()
In [77]: prompt = PromptTemplate(template=template, input_variables=["context", "q
In [78]: qa_chain = RetrievalQA.from_chain_type(
llm=chat_gpt,
chain_type="stuff",
retriever=db.as_retriever(search_kwargs={"k": 2}),
return_source_documents=True,
chain_type_kwargs={"prompt": prompt},
)
In [80]: print(qa_chain('hi')['result'])
Hello! How can I assist you with your moving needs today?
In [81]: print(qa_chain('i want to move my entity')['result'])
That's great! First, could you tell me the date you are planning to move?
In [82]: print(qa_chain('23rd nov ')['result'])
Great, that's a good day. Now, how big is your current home? Is it a 1, 2,
or 3-bedroom house?
In [83]: print(qa_chain('2')['result'])
Is there anything that is stopping you from making the decision now to use
our moving services?
In [84]: print(qa_chain('no')['result'])
localhost:8888/lab/tree/alpesh.ipynb
9/1011/11/2023, 19:18
alpesh
No worries at all. I understand that moving is a big decision and requires
some thought. If you change your mind or need further assistance, just let
me know. Remember, we are here to help ease your burden and make the movin
g process as smooth as possible for you. Is there anything else you need c
larification on or any other questions you have?
In [85]: print(qa_chain('no')['result'])
No worries at all. If you change your mind, just let me know. Sometimes it
can be hard juggling between work, family, and daily chores and people fin
d it easier to just pay a little more and let us handle it all. Also, I un
derstand that you might not be comfortable with the price or you might not
have enough confidence in the company. However, with us, you will be charg
ed fairly, you will have the best moving plan and all you need to think ab
out is how good you are going to feel when all that is handled. Can I ask
you, on a scale of 1 to 10 with 10 being you are absolutely certain and 1
being you are totally not ready no matter what, where do you find yourself
right now?
In [86]: print(qa_chain('5')['result'])
That's great! So you're halfway there. What can we do together to get you
to a 10, where you're absolutely certain about moving with us?
In [87]: print(qa_chain('from cook to dupage')['result'])
That sounds like a fairly long move. Can you please tell me more about the
items you plan on moving? Are there any particularly heavy or fragile item
s we should be aware of?
In [88]: print(qa_chain('no heavy items')['result'])
Great! It's always easier and safer to move lighter items. Can we assist y
ou with packing and unpacking your items?
In [89]: print(qa_chain('yes')['result'])
Great! I'm glad to hear that you are interested. Now, let me get some more
details from you to calculate the charges. Could you please tell me the es
timated weight of the items that can not be disassembled?
In [90]: print(qa_chain('210 lbs')['result'])
That's quite heavy, could you specify what item is this so we can make pro
per arrangements for its move?
In [91]: print(qa_chain('sorry its wrong answer. its 90 lbs')['result'])
Oh, I see. Thanks for the correction. So, the item weighing 90 lbs, is it
something that can be disassembled or does it need to be moved as is?
In [92]: print(qa_chain('how many men are required for moving 1 bhk apartment with
Considering the size of your apartment and the fact that there is one flig
ht of stairs, I would recommend a team of 2-3 men for the move. This will
ensure that the move is efficient, less tiring for the team, and cost-effe
ctive for you.
In [ ]:
localhost:8888/lab/tree/alpesh.ipynb
10/10
|
Notes is a web-based application for online taking notes. You can take your notes and share with others people. If you like taking long notes, notes.io is designed for you. To date, over 8,000,000,000+ notes created and continuing...
With notes.io;
- * You can take a note from anywhere and any device with internet connection.
- * You can share the notes in social platforms (YouTube, Facebook, Twitter, instagram etc.).
- * You can quickly share your contents without website, blog and e-mail.
- * You don't need to create any Account to share a note. As you wish you can use quick, easy and best shortened notes with sms, websites, e-mail, or messaging services (WhatsApp, iMessage, Telegram, Signal).
- * Notes.io has fabulous infrastructure design for a short link and allows you to share the note as an easy and understandable link.
Fast: Notes.io is built for speed and performance. You can take a notes quickly and browse your archive.
Easy: Notes.io doesn’t require installation. Just write and share note!
Short: Notes.io’s url just 8 character. You’ll get shorten link of your note when you want to share. (Ex: notes.io/q )
Free: Notes.io works for 14 years and has been free since the day it was started.
You immediately create your first note and start sharing with the ones you wish. If you want to contact us, you can use the following communication channels;
Email: [email protected]
Twitter: http://twitter.com/notesio
Instagram: http://instagram.com/notes.io
Facebook: http://facebook.com/notesio
Regards;
Notes.io Team