fast_beam_search_LG outputs empty for sentence with oov words #1151

Open
Slyne opened this issue Jun 28, 2023 · 13 comments

@Slyne

Slyne commented Jun 28, 2023

Hi developers,
I'm not sure whether this issue is on my side or not.
I use k2 + fast_beam_search and get good results. However, after adding LG.pt to the decoding graph, I get results for most audio files in the LibriSpeech test_clean dataset, but for six sentences the output is empty. Here are the transcripts of those six sentences:

george montfichet will never forget this day
montfichet called out for robin to give him an arm
there befell an anxious interview mistress fitzooth arguing for and against the squire's project in a breath
robin fitzooth saw that his doubts of warrenton had been unfair and he became ashamed of himself for harboring them
when the blueskins saw ghip ghisizzle they raised another great shout for he was the favorite of the soldiers and very popular with all the people
she is under sail but she is count timascheff's yacht he was right

I found that 'montfichet', 'fitzooth', 'ghisizzle', and 'timascheff's' are not in words.txt.
I'm not sure whether the empty outputs are caused by these OOV words. Could you help me figure out under what circumstances decoding produces an empty result? ngram_lm_scale has already been set to 0.0.

@csukuangfj
Collaborator

fast_beam_search_LG outputs empty for sentence with oov words

If you use LG, it can only recognize words that are present in L. If a word is not in L, you will never find it in the output.
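
For reference, a quick way to confirm that the affected transcripts contain OOV words is to check them against words.txt directly. Below is a minimal sketch in plain Python; the file path is a placeholder and the standard Kaldi-style "word id" format of words.txt is assumed.

```python
# Minimal OOV check against the word list used to build L/LG.
# Assumes the usual "word <space> integer-id" format of words.txt.

def load_vocab(words_txt: str) -> set:
    vocab = set()
    with open(words_txt, encoding="utf-8") as f:
        for line in f:
            parts = line.split()
            if parts:
                vocab.add(parts[0])
    return vocab


vocab = load_vocab("data/lang_bpe_500/words.txt")  # placeholder path

transcripts = [
    "george montfichet will never forget this day",
    "she is under sail but she is count timascheff's yacht he was right",
]

for text in transcripts:
    oov = [w for w in text.split() if w not in vocab]
    if oov:
        print(f"OOV words {oov} in: {text!r}")
```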

@pkufool
Collaborator

pkufool commented Jun 29, 2023

@Slyne Can you first try decoding with allow_partial=True? (It is an argument of fast_beam_search_one_best.)
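
For reference, a rough sketch of what such a call might look like in an icefall recipe. The keyword names follow the recipe's beam_search.py, but the exact signature can differ between recipes, so treat this as an assumption; the numeric values and the decode_batch wrapper name are purely illustrative.

```python
from beam_search import fast_beam_search_one_best  # icefall recipe module


def decode_batch(model, decoding_graph, encoder_out, encoder_out_lens):
    # Hypothetical wrapper; all search values below are illustrative.
    return fast_beam_search_one_best(
        model=model,
        decoding_graph=decoding_graph,  # e.g. LG loaded from LG.pt
        encoder_out=encoder_out,
        encoder_out_lens=encoder_out_lens,
        beam=20.0,
        max_contexts=8,
        max_states=64,
        allow_partial=True,  # the argument discussed in this thread
    )
```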

@Slyne
Author

Slyne commented Jun 29, 2023

Thank you, guys!
@csukuangfj The output is completely empty.
@pkufool This works! I just hadn't found the definition of that parameter, so I had left it at its default value (False).

I was wondering what the rule of thumb is for setting these parameters. For example:
  • If allow_partial=False, will decoding produce an empty sentence whenever there is an OOV word (a word not in the lexicon)? I'm not sure whether this behavior is expected.
  • If allow_partial=True and I use k2.shortest_path() to get the best path, will this cause any additional issues?
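
For reference, in icefall fast_beam_search_one_best builds a lattice with fast_beam_search and then takes the one-best path with k2.shortest_path, so calling shortest_path yourself follows the same route. A minimal sketch, assuming icefall's get_texts helper and a lattice produced by fast beam search; the words.txt path is a placeholder.

```python
import k2
from icefall.utils import get_texts  # maps the best path to label-id sequences


def lattice_to_hyps(lattice: k2.Fsa, words_txt: str):
    """Extract one-best word hypotheses from a fast-beam-search lattice.

    `lattice` is assumed to come from fast_beam_search with an LG graph;
    building it with allow_partial=True avoids empty lattices for
    utterances that have no complete path through LG.
    """
    best_path = k2.shortest_path(lattice, use_double_scores=True)
    word_ids = get_texts(best_path)  # one id sequence per utterance
    word_table = k2.SymbolTable.from_file(words_txt)
    return [" ".join(word_table[i] for i in ids) for ids in word_ids]
```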

@desh2608
Collaborator

desh2608 commented Jul 26, 2023

I have a possibly related issue. I am trying long-form decoding on the TED-Lium dataset (30s chunks). I tried decoding with fast_beam_search_nbest and fast_beam_search_nbest_LG. The following table summarizes the results (WER, %):

Decoding method              Oracle segments    30s chunks (long-form)
fast_beam_search_nbest       7.61               5.85
fast_beam_search_nbest_LG    6.43               9.00

Notice that when decoding on the provided segments, the LG method gets a better WER, but when decoding longer chunks it gets much worse. I found that this increase in WER is mainly due to long deleted segments such as:

REF:  like pouring a color the way we might pour A liquid so in this case we've got three SIFTABLES CONFIGURED TO BE PAINT BUCKETS AND I CAN USE THEM TO POUR COLOR INTO THAT CENTRAL ONE WHERE THEY GET MIXED IF WE OVERSHOOT WE CAN POUR A LITTLE BIT BACK 
HYP:  like pouring a color the way we might pour * liquid so in this case we've got three ********* ********** ** ** ***** ******* *** * *** *** **** ** **** ***** **** **** ******* *** ***** **** *** ***** ** ** ********* ** *** **** * ****** *** SUFI 
Eval:                                            D                                        D         D          D  D  D     D       D   D D   D   D    D  D    D     D    D    D       D   D     D    D   D     D  D  D         D  D   D    D D      D   S    

Such long deletions do not happen when decoding on the original segments. I have set allow_partial=True for both methods.

Update: It seems the --max-contexts and --max-states parameters were set incorrectly (to lower values) for the LG decoding in the 30s-chunk case. I am decoding again with consistent parameter values.

After correcting the parameters, the LG decoding WER (on 30s chunks) improved to 8%, but I still see a lot of long deleted segments such as the above.
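
For reference, one way to avoid such mismatches is to define the search parameters once and reuse them for every decoding method. A minimal sketch; the wrapper names and numeric values are hypothetical, only the three parameter names come from the thread.

```python
# Define the search parameters once and pass the same values to every
# decoding method, so WER differences reflect the method, not the settings.
SEARCH_PARAMS = {"beam": 20.0, "max_contexts": 8, "max_states": 64}  # illustrative


def run_comparison(decode_nbest, decode_nbest_lg, batch):
    # decode_nbest / decode_nbest_lg are hypothetical wrappers around the
    # two decoding methods compared in the table above.
    hyps_plain = decode_nbest(batch, **SEARCH_PARAMS)
    hyps_lg = decode_nbest_lg(batch, **SEARCH_PARAMS)
    return hyps_plain, hyps_lg
```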

@desh2608
Collaborator

@pkufool Do you have any suggestions about this?

@pkufool
Collaborator

pkufool commented Jul 27, 2023

@desh2608 I don't have any ideas about this; could you share the model and the bad cases with me? I can help debug this issue. BTW, I do have some small fixes to fast_beam_search, see k2-fsa/k2#1237; maybe you can try them. I am not sure whether this will help with your problem.

@desh2608
Collaborator

Thanks. Let me try it again with the updated k2. If the problem persists, I will share a minimal reproducible example.

@desh2608
Collaborator

I also see some related PRs: k2-fsa/k2#1134 and k2-fsa/k2#1218. Should I pull those as well?

@pkufool
Collaborator

pkufool commented Jul 27, 2023

No, just the latest one.

@desh2608
Collaborator

@pkufool Unfortunately, the problem persists even after pulling your latest changes. To help you reproduce the issue, I have created a package containing the following:

  • the model checkpoint (it is the latest zipformer, with causal=False)
  • a decoding batch (the last utterance gets a lot of deletions)
  • LG.pt
  • words.txt

Download link: Google Drive

Since I am decoding with artificial 30s chunks, the "utterances" do not have corresponding reference texts. However, based on the start and end times of the utterances, the problematic utterance spans 446s to 484s in the recording, which roughly corresponds to the following segments from the STM file:

WadeDavis_2003 0 Wade_Davis 440.38 463.37 <F0_M> profound religious ideas that came over during the tragic diaspora of the slavery era but what makes voodoo so interesting is that it's this living relationship between the living and the dead so the living give birth to the spirits the spirits can be invoked from beneath the great water responding to the rhythm of the dance to momentarily displace the soul of the living so that for that brief shining moment the acolyte becomes the god that's why the voodooists like to say that
WadeDavis_2003 0 Wade_Davis 463.38 474.22 <F0_M> you white people go to church and speak about god we dance in the temple and become god and because you are possessed you are taken by the spirit how can you be harmed so you see these astonishing
WadeDavis_2003 0 Wade_Davis 474.22 482.69 <F0_M> demonstrations voodoo acolytes in a state of trance handling burning embers with impunity a rather astonishing example of the ability of the

At first I thought the problem only happened with the last utterance in the batch for some reason, but other batches also have utterances in the middle with long deleted segments.

Please let me know if you need anything else that would be useful for debugging the problem.

@desh2608
Collaborator

desh2608 commented Jul 27, 2023

I reverted the commit from k2-fsa/k2#1237 since it was causing errors during MWER training (see stack trace below).

[F] /exp/draj/jsalt2023/k2/k2/csrc/eval.h:148:void k2::EvalDevice(cudaStream_t, int32_t, LambdaT&) [with LambdaT = __nv_dl_wrapper_t<__nv_dl_tag<void (k2::rnnt_decoding::RnntDecodingStreams::*)(const std::vector<int>&, bool, k2::Ragged<k2::Arc>*, k2::Array1<int>*, k2::Array1<int>*, const k2::RaggedShape&), &k2::rnnt_decoding::RnntDecodingStreams::FormatOutput, 5>, int*, int*, int*, int*, int*, int*, int*, unsigned int*, int, k2::rnnt_decoding::ArcInfo**, int*, const k2::Arc* const*, k2::Arc*, k2::Array1<int>*, int*, const int*, const int*, int*, int>; cudaStream_t = CUstream_st*; int32_t = int] Check failed: e == cudaSuccess (700 vs. 0)  Error: an illegal memory access was encountered. 


[ Stack-Trace: ]
/exp/draj/jsalt2023/k2/build_debug/lib/libk2_log.so(k2::internal::GetStackTrace()+0x46) [0x2aab383ffb88]
/exp/draj/jsalt2023/k2/build_debug/lib/libk2context.so(k2::internal::Logger::~Logger()+0x35) [0x2aab3247ad65]
/exp/draj/jsalt2023/k2/build_debug/lib/libk2context.so(void k2::EvalDevice<__nv_dl_wrapper_t<__nv_dl_tag<void (k2::rnnt_decoding::RnntDecodingStreams::*)(std::vector<int, std::allocator<int> > const&, bool, k2::Ragged<k2::Arc>*, k2::Array1<int>*, k2::Array1<int>*, k2::RaggedShape const&), &k2::rnnt_decoding::RnntDecodingStreams::FormatOutput, 5u>, int*, int*, int*, int*, int*, int*, int*, unsigned int*, int, k2::rnnt_decoding::ArcInfo**, int*, k2::Arc const* const*, k2::Arc*, k2::Array1<int>*, int*, int const*, int const*, int*, int> >(CUstream_st*, int, __nv_dl_wrapper_t<__nv_dl_tag<void (k2::rnnt_decoding::RnntDecodingStreams::*)(std::vector<int, std::allocator<int> > const&, bool, k2::Ragged<k2::Arc>*, k2::Array1<int>*, k2::Array1<int>*, k2::RaggedShape const&), &k2::rnnt_decoding::RnntDecodingStreams::FormatOutput, 5u>, int*, int*, int*, int*, int*, int*, int*, unsigned int*, int, k2::rnnt_decoding::ArcInfo**, int*, k2::Arc const* const*, k2::Arc*, k2::Array1<int>*, int*, int const*, int const*, int*, int>&)+0x381) [0x2aab326c1c97]
/exp/draj/jsalt2023/k2/build_debug/lib/libk2context.so(void k2::EvalDevice<std::shared_ptr<k2::Context>, __nv_dl_wrapper_t<__nv_dl_tag<void (k2::rnnt_decoding::RnntDecodingStreams::*)(std::vector<int, std::allocator<int> > const&, bool, k2::Ragged<k2::Arc>*, k2::Array1<int>*, k2::Array1<int>*, k2::RaggedShape const&), &k2::rnnt_decoding::RnntDecodingStreams::FormatOutput, 5u>, int*, int*, int*, int*, int*, int*, int*, unsigned int*, int, k2::rnnt_decoding::ArcInfo**, int*, k2::Arc const* const*, k2::Arc*, k2::Array1<int>*, int*, int const*, int const*, int*, int> >(std::shared_ptr<k2::Context>, int, __nv_dl_wrapper_t<__nv_dl_tag<void (k2::rnnt_decoding::RnntDecodingStreams::*)(std::vector<int, std::allocator<int> > const&, bool, k2::Ragged<k2::Arc>*, k2::Array1<int>*, k2::Array1<int>*, k2::RaggedShape const&), &k2::rnnt_decoding::RnntDecodingStreams::FormatOutput, 5u>, int*, int*, int*, int*, int*, int*, int*, unsigned int*, int, k2::rnnt_decoding::ArcInfo**, int*, k2::Arc const* const*, k2::Arc*, k2::Array1<int>*, int*, int const*, int const*, int*, int>&)+0x42) [0x2aab326b663c]
/exp/draj/jsalt2023/k2/build_debug/lib/libk2context.so(k2::rnnt_decoding::RnntDecodingStreams::FormatOutput(std::vector<int, std::allocator<int> > const&, bool, k2::Ragged<k2::Arc>*, k2::Array1<int>*, k2::Array1<int>*, k2::RaggedShape const&)+0x1484) [0x2aab326a6c26]
/exp/draj/jsalt2023/k2/build_debug/lib/_k2.cpython-38-x86_64-linux-gnu.so(+0x219fcf) [0x2aab2da07fcf]
/exp/draj/jsalt2023/k2/build_debug/lib/_k2.cpython-38-x86_64-linux-gnu.so(+0x21ec17) [0x2aab2da0cc17]
/exp/draj/jsalt2023/k2/build_debug/lib/_k2.cpython-38-x86_64-linux-gnu.so(+0x21e3b0) [0x2aab2da0c3b0]
/exp/draj/jsalt2023/k2/build_debug/lib/_k2.cpython-38-x86_64-linux-gnu.so(+0x21dd0d) [0x2aab2da0bd0d]
/exp/draj/jsalt2023/k2/build_debug/lib/_k2.cpython-38-x86_64-linux-gnu.so(+0x21ddbd) [0x2aab2da0bdbd]
/exp/draj/jsalt2023/k2/build_debug/lib/_k2.cpython-38-x86_64-linux-gnu.so(+0x9d5f9) [0x2aab2d88b5f9]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(PyCFunction_Call+0x52) [0x4f5652]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyObject_MakeTpCall+0x3bb) [0x4e0c8b]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f53fd]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x49a9) [0x4dc999]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalCodeWithName+0x2f1) [0x4d6fb1]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f51bb]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x1150) [0x4d9140]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalCodeWithName+0x2f1) [0x4d6fb1]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyFunction_Vectorcall+0x19c) [0x4e807c]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x1150) [0x4d9140]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalCodeWithName+0x2f1) [0x4d6fb1]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyFunction_Vectorcall+0x19c) [0x4e807c]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f548e]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(PyObject_Call+0x34e) [0x4f778e]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x1f7b) [0x4d9f6b]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalCodeWithName+0x2f1) [0x4d6fb1]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyFunction_Vectorcall+0x19c) [0x4e807c]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyObject_FastCallDict+0x282) [0x4e0442]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyObject_Call_Prepend+0x60) [0x4f1fd0]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x5ab347]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(PyObject_Call+0x2cf) [0x4f770f]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x1f7b) [0x4d9f6b]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalCodeWithName+0x2f1) [0x4d6fb1]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyFunction_Vectorcall+0x19c) [0x4e807c]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f548e]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(PyObject_Call+0x34e) [0x4f778e]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x1f7b) [0x4d9f6b]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalCodeWithName+0x2f1) [0x4d6fb1]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyFunction_Vectorcall+0x19c) [0x4e807c]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f548e]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(PyObject_Call+0x34e) [0x4f778e]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x1f7b) [0x4d9f6b]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalCodeWithName+0x2f1) [0x4d6fb1]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyFunction_Vectorcall+0x19c) [0x4e807c]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyObject_FastCallDict+0x282) [0x4e0442]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyObject_Call_Prepend+0x60) [0x4f1fd0]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x5ab347]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyObject_MakeTpCall+0x3bb) [0x4e0c8b]
/home/hltcoe/draj/.conda/envs/jsalt/bin/python(_PyEval_EvalFrameDefault+0x4fa6) [0x4dcf96]

2023-07-27 12:42:42,551 INFO [checkpoint.py:75] (0/4) Saving checkpoint to zipformer/exp_new/a0d/bad-model-0.pt
[W CUDAGuardImpl.h:124] Warning: CUDA warning: an illegal memory access was encountered (function destroyEvent)
terminate called after throwing an instance of 'c10::Error'
  what():  CUDA error: an illegal memory access was encountered
Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.

Exception raised from c10_cuda_check_implementation at ../c10/cuda/CUDAException.cpp:44 (most recent call first):
frame #0: c10::Error::Error(c10::SourceLocation, std::string) + 0x57 (0x2aab235b04d7 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #1: c10::detail::torchCheckFail(char const*, char const*, unsigned int, std::string const&) + 0x64 (0x2aab2357a36b in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #2: c10::cuda::c10_cuda_check_implementation(int, char const*, char const*, int, bool) + 0x118 (0x2aab23499b58 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #3: <unknown function> + 0x1c36b (0x2aab2346a36b in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #4: <unknown function> + 0x2b930 (0x2aab23479930 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10_cuda.so)
frame #5: <unknown function> + 0x4d46d6 (0x2aaacbeaa6d6 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libtorch_python.so)
frame #6: <unknown function> + 0x3ee77 (0x2aab23595e77 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #7: c10::TensorImpl::~TensorImpl() + 0x1be (0x2aab2358e69e in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #8: c10::TensorImpl::~TensorImpl() + 0x9 (0x2aab2358e7b9 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libc10.so)
frame #9: c10d::Reducer::~Reducer() + 0x254 (0x2aaad202ded4 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libtorch_cpu.so)
frame #10: std::_Sp_counted_ptr<c10d::Reducer*, (__gnu_cxx::_Lock_policy)2>::_M_dispose() + 0x12 (0x2aaacc51f4c2 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libtorch_python.so)
frame #11: std::_Sp_counted_base<(__gnu_cxx::_Lock_policy)2>::_M_release() + 0x48 (0x2aaacbd88a28 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libtorch_python.so)
frame #12: <unknown function> + 0xb4bac1 (0x2aaacc521ac1 in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libtorch_python.so)
frame #13: <unknown function> + 0x3bc54b (0x2aaacbd9254b in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libtorch_python.so)
frame #14: <unknown function> + 0x3bd4bf (0x2aaacbd934bf in /home/hltcoe/draj/.conda/envs/jsalt/lib/python3.8/site-packages/torch/lib/libtorch_python.so)
frame #15: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4d398e]
frame #16: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f96b6]
frame #17: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4e07e0]
frame #18: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4e085a]
frame #19: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f1908]
frame #20: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f1569]
frame #21: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4f152d]
frame #22: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x4ce758]
frame #23: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x506c83]
frame #24: _PyEval_EvalFrameDefault + 0x2412 (0x4da402 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #25: _PyEval_EvalCodeWithName + 0x2f1 (0x4d6fb1 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #26: _PyFunction_Vectorcall + 0x19c (0x4e807c in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #27: _PyEval_EvalFrameDefault + 0x6b2 (0x4d86a2 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #28: _PyFunction_Vectorcall + 0x106 (0x4e7fe6 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #29: _PyEval_EvalFrameDefault + 0x399 (0x4d8389 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #30: _PyEval_EvalCodeWithName + 0x2f1 (0x4d6fb1 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #31: _PyFunction_Vectorcall + 0x19c (0x4e807c in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #32: _PyEval_EvalFrameDefault + 0x1150 (0x4d9140 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #33: _PyEval_EvalCodeWithName + 0x2f1 (0x4d6fb1 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #34: PyEval_EvalCodeEx + 0x39 (0x585d79 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #35: PyEval_EvalCode + 0x1b (0x585d3b in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #36: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x5a5a91]
frame #37: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x5a4a9f]
frame #38: PyRun_StringFlags + 0x7b (0x5a24ab in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #39: PyRun_SimpleStringFlags + 0x3b (0x4509c4 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #40: Py_RunMain + 0x278 (0x5a1ad8 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #41: Py_BytesMain + 0x39 (0x579dd9 in /home/hltcoe/draj/.conda/envs/jsalt/bin/python)
frame #42: __libc_start_main + 0xf5 (0x2aaaab616445 in /lib64/libc.so.6)
frame #43: /home/hltcoe/draj/.conda/envs/jsalt/bin/python() [0x579c8d]

@pkufool
Collaborator

pkufool commented Jul 28, 2023

OK, that change has not been fully tested yet. Thanks for your log.

@desh2608
Collaborator

desh2608 commented Jul 31, 2023

Looks like the deletion issue has been encountered before: #420 (comment)

Update: I followed Dan's advice from the linked thread to increase the log-likelihood beam, and it solved the deletion issue.
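
For context, the log-likelihood beam here is the beam value that fast_beam_search forwards to k2's RNN-T decoding config. A rough sketch of where it lives, assuming k2's RnntDecodingConfig as used by icefall's fast_beam_search; all values are illustrative, not recommendations.

```python
import k2

# A larger log-likelihood beam keeps more low-scoring paths alive during the
# pruned search, which is what removed the long deletions reported above.
config = k2.RnntDecodingConfig(
    vocab_size=500,           # illustrative; must match the model's vocabulary
    decoder_history_len=2,    # RNN-T decoder context size
    beam=32.0,                # the log-likelihood beam; raised from a smaller value
    max_contexts=8,
    max_states=64,
)
```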
