Fix for reloading with shared modules #1091

Merged
merged 2 commits into from Sep 23, 2020

Conversation

dl748
Contributor

@dl748 dl748 commented Sep 5, 2020

Updating previous hot reloading fix.

The main issue is that clear-module indiscriminately deleted all child modules of the module being cleared.

I pulled in the clear-module code (since the author did not accept my PR that added filtering) and modified it to only remove children that are not inside node_modules, then run a cleanup that deletes all modules whose parent was removed from the list. Basically, node_modules packages that are used ONLY by the handler.

One of the issues was that if a handler, or a submodule of the handler, used say aws-sdk, it would remove aws-sdk from memory even though serverless and serverless-offline were still using it. This should prevent that, and also prevent any future module that serverless/serverless-offline uses from being unloaded while it is still in use.
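
A minimal sketch of that filtering idea (illustrative only, not the exact code in this PR; clearHandlerModule and its internals are hypothetical names):

```js
const path = require('path')

// Recursively drop a handler module and its non-node_modules children from
// require.cache, then sweep node_modules entries whose parent was removed,
// i.e. packages used ONLY by the handler.
function clearHandlerModule(entry) {
  const removed = new Set()

  const remove = (filename) => {
    const cached = require.cache[filename]
    if (!cached || removed.has(filename)) return
    removed.add(filename)
    delete require.cache[filename]

    // Only recurse into children that are NOT inside node_modules,
    // so shared dependencies (e.g. aws-sdk) stay loaded.
    cached.children
      .filter((child) => !child.filename.includes(`${path.sep}node_modules${path.sep}`))
      .forEach((child) => remove(child.filename))
  }

  remove(require.resolve(entry))

  // Cleanup pass: drop cached modules whose parent was just removed.
  Object.keys(require.cache).forEach((filename) => {
    const mod = require.cache[filename]
    if (mod && mod.parent && removed.has(mod.parent.filename)) {
      delete require.cache[filename]
    }
  })
}
```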

This also fixes the async test; I initially thought the issue was due to the recursive nature of the test.

@kelchm
Contributor

kelchm commented Sep 5, 2020

For me this seems to introduce a memory leak when using the default options for serverless-offline.

(node:99903) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 exit listeners added to [process]. Use emitter.setMaxListeners() to increase limit
(node:99903) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 SIGINT listeners added to [process]. Use emitter.setMaxListeners() to increase limit
(node:99903) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 SIGTERM listeners added to [process]. Use emitter.setMaxListeners() to increase limit

After enough requests the inevitable occurs:

<--- Last few GCs --->

[99639:0x102d52000]   532426 ms: Mark-sweep 2020.1 (2086.8) -> 2010.8 (2090.2) MB, 1414.7 / 1.7 ms  (average mu = 0.138, current mu = 0.070) allocation failure GC in old space requested
[99639:0x102d52000]   534082 ms: Mark-sweep 2016.7 (2090.2) -> 2012.0 (2091.8) MB, 1551.4 / 2.2 ms  (average mu = 0.099, current mu = 0.063) allocation failure GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x05972c5408d1 <JSObject>
    0: builtin exit frame: compileFunction(this=0x059720e82d81 <JSGlobal Object>,0x0597fed61b01 <JSArray[5]>,0x0597fed61ae1 <JSArray[0]>,0x05972aa804b1 <undefined>,0x05972aa806e9 <false>,0x05972aa804b1 <undefined>,0,0,0x0597ea3a18d9 <String[#101]: /Users/kelchm/Development/Modus Create/Payactiv/Engage/server-api/node_modules/aws-sdk/clients/all.js>,0x0597fed65a61 <Very long string[9166]>,0x05972...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0x1011c2ff5 node::Abort() (.cold.1) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 2: 0x10009fbc9 node::Abort() [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 3: 0x10009fd2f node::OnFatalError(char const*, char const*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 4: 0x1001e3907 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 5: 0x1001e38a7 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 6: 0x1003695e5 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 7: 0x10036ae3a v8::internal::Heap::RecomputeLimits(v8::internal::GarbageCollector) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 8: 0x1003678be v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
 9: 0x100365670 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
10: 0x10037149a v8::internal::Heap::AllocateRawWithLightRetry(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
11: 0x100371521 v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
12: 0x10033e1ad v8::internal::Factory::NewFixedArrayWithFiller(v8::internal::RootIndex, int, v8::internal::Object, v8::internal::AllocationType) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
13: 0x100460e7a v8::internal::interpreter::ConstantArrayBuilder::ToFixedArray(v8::internal::Isolate*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
14: 0x10043fe04 v8::internal::interpreter::BytecodeArrayWriter::ToBytecodeArray(v8::internal::Isolate*, int, int, v8::internal::Handle<v8::internal::ByteArray>) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
15: 0x100442b4d v8::internal::interpreter::BytecodeGenerator::FinalizeBytecode(v8::internal::Isolate*, v8::internal::Handle<v8::internal::Script>) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
16: 0x10046437f v8::internal::interpreter::InterpreterCompilationJob::FinalizeJobImpl(v8::internal::Handle<v8::internal::SharedFunctionInfo>, v8::internal::Isolate*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
17: 0x1002ab79a v8::internal::(anonymous namespace)::FinalizeUnoptimizedCompilationJob(v8::internal::UnoptimizedCompilationJob*, v8::internal::Handle<v8::internal::SharedFunctionInfo>, v8::internal::Isolate*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
18: 0x1002a7c83 v8::internal::(anonymous namespace)::CompileToplevel(v8::internal::ParseInfo*, v8::internal::Isolate*, v8::internal::IsCompiledScope*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
19: 0x1002aa0d3 v8::internal::Compiler::GetWrappedFunction(v8::internal::Handle<v8::internal::String>, v8::internal::Handle<v8::internal::FixedArray>, v8::internal::Handle<v8::internal::Context>, v8::internal::Compiler::ScriptDetails const&, v8::ScriptOriginOptions, v8::internal::ScriptData*, v8::ScriptCompiler::CompileOptions, v8::ScriptCompiler::NoCacheReason) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
20: 0x1001eea97 v8::ScriptCompiler::CompileFunctionInContext(v8::Local<v8::Context>, v8::ScriptCompiler::Source*, unsigned long, v8::Local<v8::String>*, unsigned long, v8::Local<v8::Object>*, v8::ScriptCompiler::CompileOptions, v8::ScriptCompiler::NoCacheReason, v8::Local<v8::ScriptOrModule>*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
21: 0x10009313e node::contextify::ContextifyContext::CompileFunction(v8::FunctionCallbackInfo<v8::Value> const&) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
22: 0x10024f438 v8::internal::FunctionCallbackArguments::Call(v8::internal::CallHandlerInfo) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
23: 0x10024e9f9 v8::internal::MaybeHandle<v8::internal::Object> v8::internal::(anonymous namespace)::HandleApiCallHelper<false>(v8::internal::Isolate*, v8::internal::Handle<v8::internal::HeapObject>, v8::internal::Handle<v8::internal::HeapObject>, v8::internal::Handle<v8::internal::FunctionTemplateInfo>, v8::internal::Handle<v8::internal::Object>, v8::internal::BuiltinArguments) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
24: 0x10024e162 v8::internal::Builtin_Impl_HandleApiCall(v8::internal::BuiltinArguments, v8::internal::Isolate*) [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
25: 0x1009d3ab9 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvOnStack_BuiltinExit [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
26: 0x100959364 Builtins_InterpreterEntryTrampoline [/Users/kelchm/.nvm/versions/node/v12.18.3/bin/node]
[1]    99638 abort      npm start

Additionally, as raised in #1074, there will still be a significant performance penalty on every request for any functions that need to do setup (establishing a database connection, retrieving credentials, etc.) outside of the handler itself. I've measured this to average about 8 times slower on one of my projects. In general I think it's great to have the allowCache option available, but at this point it seems to come with too many side effects for allowCache: false to be the default.
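
As an illustration of where that penalty comes from (a hypothetical handler, not code from this project; pg is just an example dependency), module-level setup like this is re-executed every time the module cache is cleared:

```js
// Hypothetical handler: with allowCache: false the module is re-required on
// every request, so this module-level setup (connection, credentials, etc.)
// runs again each time instead of being reused across warm invocations.
const { Client } = require('pg') // example dependency, not from this PR

const client = new Client()    // expensive setup outside the handler
const ready = client.connect() // re-run on every reload of the module

module.exports.handler = async (event) => {
  await ready // with a cached module this cost is paid only once
  const { rows } = await client.query('SELECT 1')
  return { statusCode: 200, body: JSON.stringify(rows) }
}
```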

@dl748
Contributor Author

dl748 commented Sep 5, 2020

That is not a memory leak. I ran a test against one of my own projects, with over 4,000 endpoints, calling each endpoint 50,000 times; memory usage fluctuated between 1.5 GB and 4 GB of RAM. Note: for this project I have to run with max_old_space_size=8192, but I've had to do that since version 5.1 due to the project's memory usage.

It's impossible to detect a memory leak with 100% certainty in a garbage-collected system, which is why the warning says "possible", whereas your comment states it as a certainty.

The reason comes down to how Node.js works: clearing out a module does not IMMEDIATELY free the memory.

This is EASILY provable by adding a GC run between calls to the clearModule function I added. Running the same test that way, usage never went above 1.1 GB. Unfortunately, this causes two issues: node has to be run with the 'expose_gc' flag, and global.gc() is VERY expensive, adding 1-1.5 seconds to each handler call.
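
A rough sketch of that experiment (assuming node is started with --expose-gc and a clearModule helper like the one added in this PR; reloadHandler is a hypothetical wrapper):

```js
// Force a collection right after clearing the handler's modules, so freed
// module memory is reclaimed immediately instead of waiting for the next GC.
function reloadHandler(handlerPath) {
  clearModule(handlerPath) // drop the handler and its handler-only children

  if (typeof global.gc === 'function') {
    global.gc() // only available with --expose-gc; very slow (~1-1.5 s)
  }

  return require(handlerPath) // re-require a fresh copy of the handler
}
```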

If it were a REAL memory leak, the GC would see it as USED memory and keep it alive.

Note: the serverless-webpack plugin leaks memory horribly. This is a long-standing issue with serverless-webpack -> serverless-heaven/serverless-webpack#299

@dl748
Contributor Author

dl748 commented Sep 5, 2020

Note: the reason you didn't see an issue with the 6+ branch is that it was not doing ANY reloading of the code during that time, so memory usage wouldn't change.

@kelchm
Contributor

kelchm commented Sep 5, 2020

For what it's worth, I'm not using package: individually: true with serverless-webpack. While I don't doubt that there is an underlying issue there, I had never encountered it until testing these changes.

Not trying to be contentious, but I think that unless allowCache: true becomes the default again, many serverless-offline users are going to continue to encounter issues with things that worked perfectly fine in previous versions. I understand that comes at the cost of hot reloading working by default, but couldn't that be addressed by adding some documentation to the README about how to enable it?

@dl748
Contributor Author

dl748 commented Sep 5, 2020

For what it's worth, I'm not using package: individually: true with serverless-webpack. While I don't doubt that there is an underlying issue there, I had never encountered it until testing these changes.

That's just one instance; there are other tickets. There are a lot of tickets where the solution was to "just increase memory". I've had my own issues with serverless-webpack and had to increase the memory, even well before my fix, especially when used with serverless-offline.

Not trying to be contentious, but I think that unless allowCache: true becomes the default again, many serverless-offline users are going to continue to encounter issues with things that worked perfectly fine in previous versions. I understand that comes at the cost of hot reloading working by default, but couldn't that be addressed by adding some documentation to the README about how to enable it?

I don't believe something this specific to developers should have hot reloading off by default. Also, while you are having that issue, it was already a problem pre-6; it only "went away" because "hot reloading" was broken. Reintroducing an error that was never "fixed" in the first place doesn't seem like a problem to me.

Rather than that, I think time would be better spent finding a better solution to the problem. With that in mind, there should be a way for plugins to notify serverless-offline when content changes; serverless-offline would then determine when content should be "flushed" based on a flag. We could then ask serverless-webpack to implement it, or you could submit a patch. There is no good solution without both sides having code to handle it.
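
Purely as a hypothetical sketch of that idea (none of these names exist in serverless-offline today):

```js
// A bundler plugin (e.g. serverless-webpack) notifies serverless-offline
// after a rebuild; serverless-offline only flushes the require cache for
// functions that were actually marked dirty.
class OfflineReloadBridge {
  constructor() {
    this.dirty = new Set()
  }

  // called by the bundler plugin when a function's output changes
  notifyChanged(functionName) {
    this.dirty.add(functionName)
  }

  // called by serverless-offline before invoking a handler
  shouldReload(functionName) {
    const changed = this.dirty.has(functionName)
    this.dirty.delete(functionName)
    return changed
  }
}
```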

@dherault dherault merged commit ca3e396 into dherault:master Sep 23, 2020
@dherault
Owner

Thanks @dl748, I agree the value of your feature is greater than the cost of a memory leak; we'll find someone to fix it. v6.8.0
