Search Results

Search found 4 results on 1 page for 'aqtime'.


  • Why AQTime slows execution even when profiling is not on, and can anything be done about it?

    - by Antti Suni
    Hi! AQTime for Delphi boasts that it is very fast to get to the trouble spots by using areas, triggers and so on. But it seems to me that, especially when there is a lot of code in the areas to profile, execution slows down dramatically even while profiling is NOT on.

    For example, if I want to profile a specific routine late in the program flow but don't know what is called from it, I would put only that routine in as a trigger, set the initial status for threads to Off, and choose "Full check by Routines/Lines". However, when I do this, program execution slows down heavily long before the trigger routine has ever been hit. For example, if the "preparation flow" takes around 5 minutes without AQTime, then with AQTime attached but profiling disabled it has already been running for 30 minutes and is still going, even though I know the trigger has not yet been reached.

    I know I can try to work around this by reducing the number of routines/lines profiled, but that is not really a good solution for me, since I'd like to profile all of them once I reach the actual trigger routine. Another, often better workaround is to start the application without AQTime and then use Attach to Process after the "preparation flow" has finished, but this only works well when execution pauses in the GUI at the proper place or otherwise leaves a suitable time window for attaching, which is not always the case. Any comments on why this happens, and is there anything else to do besides reducing the code in the areas or attaching to the process later?

    Read the article

  • VS2010 profiler/leak detection

    - by Noah Roberts
    Anyone know of a profiler and leak detector that will work with VS2010 code, preferably one that runs on Win7? I've searched here and on Google. I've found one leak detector that works (Memory Validator), but I'm not too impressed: for one thing it shows a bunch of menu leaks and similar things that I'm fairly confident are not real. I also tried GlowCode, but it's JUST a profiler and refuses to install on Win7. I used to use AQtime; it had everything I needed (memory/resource leak detection, profiling of various things, static analysis, etc.), but unfortunately it gives bogus results now. My main immediate issue is that VS2010 is saying there are leaks in a program that had none under VS2005. I'm almost certain they are false positives, but I can't seem to find a good tool to verify this: Memory Validator doesn't show the same ones, and the leak reporting from VS doesn't seem rational. (One way to cross-check this with the CRT debug heap is sketched after this item.)

    Read the article
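
    Not part of the question, but one way to cross-check whether the leaks VS2010 reports are real is the CRT debug heap that ships with the compiler. A minimal sketch, assuming a native C++ project built against the debug CRT; the deliberate leak is only there to show what a report entry looks like:

      // Hedged sketch: ask the CRT debug heap for its own leak report at exit,
      // so it can be compared against what the VS2010 output claims.
      #define _CRTDBG_MAP_ALLOC      // include file/line info for CRT allocations
      #include <stdlib.h>
      #include <crtdbg.h>

      int main()
      {
          // Dump any heap blocks still allocated when the process shuts down.
          _CrtSetDbgFlag(_CrtSetDbgFlag(_CRTDBG_REPORT_FLAG) | _CRTDBG_LEAK_CHECK_DF);

          // Send the report to the debugger's Output window.
          _CrtSetReportMode(_CRT_WARN, _CRTDBG_MODE_DEBUG);

          int* leaked = new int[10];   // intentionally leaked so one entry appears
          (void)leaked;
          return 0;                    // the report is printed during CRT shutdown
      }

    If the CRT report and VS2010's list disagree, that supports the false-positive suspicion.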

  • Make process crash on large memory allocation

    - by Pieter
    I'm trying to find a significant memory leak (15 MB at a time, with allocations like this happening in multiple places). I checked the most obvious places and then used AQTime, but I still can't pinpoint it. Now I see two options left:

    1) Use SetProcessWorkingSetSize. I've tried this, but my process happily keeps running even when it is using more than 150 MB:

       DWORD MemorySize = 150*1024*1024;
       SetProcessWorkingSetSize(GetCurrentProcess(), MemorySize/2, MemorySize*2);

    2) Put a breakpoint on any allocation of more than 1 MB at a time. How should I do this? Overload operator new with an "if > 1 MB" check inside? (A sketch of that approach follows this item.)

    Read the article
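
    Not from the question itself, but a minimal sketch of option 2: replace the global operator new so that it breaks into the debugger whenever a single allocation request exceeds 1 MB. The 1 MB threshold, the Windows DebugBreak call, and the layout here are illustrative assumptions, not a confirmed recipe:

      // Hedged sketch: global operator new override that stops in the debugger
      // on any single allocation larger than 1 MB, so the call stack reveals
      // who is asking for the memory. Pre-C++11 compilers would use throw()
      // instead of noexcept on the delete overload.
      #include <cstdlib>
      #include <new>
      #include <windows.h>   // IsDebuggerPresent, DebugBreak

      void* operator new(std::size_t size)
      {
          if (size > 1024 * 1024 && IsDebuggerPresent())
              DebugBreak();                      // inspect the call stack here
          if (void* p = std::malloc(size))
              return p;
          throw std::bad_alloc();
      }

      void operator delete(void* p) noexcept
      {
          std::free(p);
      }

    The default operator new[] forwards to operator new, so array allocations hit the same check.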

  • std::vector optimisation required

    - by marcp
    I've written a routine that uses std::vector<double> rather heavily. It runs rather slowly, and AQTime seems to imply that I am constructing mountains of vectors, but I'm not sure why I would be. For some context: my sample run iterates 10 times. Each iteration copies 3 C arrays of ~400 points into vectors and creates 3 new same-sized vectors for output. Each output point might be the result of summing up to 20 points from 2 of the input vectors, which works out to a worst case of 10*400*3*2*20 = 480,000 dereferences. Incredibly, the profiler indicates that some of the std:: methods are being called 46 MILLION times. I suspect I'm doing something wrong! Some code:

      vector<double> gdbChannel::GetVector()
      {
          if (fHaveDoubleData & (fLength > 0)) {
              double * pD = getDoublePointer();
              vector<double> v(pD, pD + fLength);
              return v;
          } else {
              throw(Exception("attempt to retrieve vector on empty line"));
          }
      }

      void gdbChannel::SaveVector(GX_HANDLE _hLine, const vector<double> & V)
      {
          if (hLine != _hLine) {
              GetLine(_hLine, V.size(), true);
          }
          GX_DOUBLE * pData = getDoublePointer();
          memcpy(pData, &V[0], V.size() * sizeof(V[0]));
          ReplaceData();
      }

      // This routine gets called 10 times
      bool SpecRatio::DoWork(GX_HANDLE_PTR pLine)
      {
          if (!(hKin.GetLine(*pLine, true) && hUin.GetLine(*pLine, true) && hTHin.GetLine(*pLine, true))) {
              return true;
          }
          vector<double> vK = hKin.GetVector();
          vector<double> vU = hUin.GetVector();
          vector<double> vTh = hTHin.GetVector();
          if ((vK.size() == 0) || (vU.size() == 0) || (vTh.size() == 0)) {
              return true;
          }
          // TODO: confirm all vectors are the same length
          len = vK.size();
          vUK.clear();    // these 3 vectors are declared as private class members
          vUTh.clear();
          vThK.clear();
          vUK.reserve(len);
          vUTh.reserve(len);
          vThK.reserve(len);
          // TODO: ensure everything has the same fidincr, fidstart and length
          for (int i = 0; i < len; i++) {
              if (vK.at(i) < MinK) {
                  vUK.push_back(rDUMMY);
                  vUTh.push_back(rDUMMY);
                  vThK.push_back(rDUMMY);
              } else {
                  vUK.push_back(RatioPoint(vU, vK, i, UMin, KMin));
                  vUTh.push_back(RatioPoint(vU, vTh, i, UMin, ThMin));
                  vThK.push_back(RatioPoint(vTh, vK, i, ThMin, KMin));
              }
          }
          hUKout.setFidParams(hKin);
          hUKout.SaveVector(*pLine, vUK);
          hUTHout.setFidParams(hKin);
          hUTHout.SaveVector(*pLine, vUTh);
          hTHKout.setFidParams(hKin);
          hTHKout.SaveVector(*pLine, vThK);
          return TestError();
      }

      double SpecRatio::VValue(vector<double> V, int Index)
      {
          double result;
          if ((Index < 0) || (Index >= len)) {
              result = 0;
          } else {
              try {
                  result = V.at(Index);
                  if (OasisUtils::isDummy(result)) {
                      result = 0;
                  }
              } catch (out_of_range) {
                  result = 0;
              }
          }
          return result;
      }

      double SpecRatio::RatioPoint(vector<double> Num, vector<double> Denom, int Index, double NumMin, double DenomMin)
      {
          double num = VValue(Num, Index);
          double denom = VValue(Denom, Index);
          int s = 0;
          // Search equalled 10 in this case
          while (((num < NumMin) || (denom < DenomMin)) && (s < Search)) {
              num += VValue(Num, Index - s) + VValue(Num, Index + s);
              denom += VValue(Denom, Index - s) + VValue(Denom, Index + s);
              s++;
          }
          if ((num < NumMin) || (denom < DenomMin)) {
              return rDUMMY;
          } else {
              return num / denom;
          }
      }

    The top AQTime offenders are:

      std::_Uninit_copy<double *, double *, std::allocator<double> >              3.65 secs, 115,731 hits
      std::_Construct                                                             1.69 secs, 46,450,637 hits
      std::_Vector_const_iterator<double, std::allocator<double> >::operator !=   1.66 secs, 46,566,395 hits

    and so on. std::allocator<double>::construct, operator new, std::_Vector_const_iterator<double, std::allocator<double> >::operator ++, std::_Vector_const_iterator<double, std::allocator<double> >::operator *, and std::_Vector_const_iterator<double, std::allocator<double> >::operator == each get called over 46 million times. I'm obviously doing something wrong to cause all these objects to be created. Can anyone see my error(s)? (One possible culprit is sketched after this item.)

    Read the article
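
    Not part of the question, but a hedged illustration of one plausible source of the call counts above: VValue and RatioPoint take their vectors by value, so every call copies an entire ~400-element vector, and RatioPoint itself calls VValue repeatedly. A self-contained toy comparison (the names and sizes are made up for the example; the fix is an assumption rather than a confirmed diagnosis):

      // Toy comparison: by-value vector parameters copy every element on every
      // call, which is the kind of pattern that can show up in a profiler as
      // millions of _Construct and iterator-operator hits.
      #include <cstdio>
      #include <vector>

      static double sumByValue(std::vector<double> v)            // copies all elements per call
      {
          double s = 0;
          for (std::size_t i = 0; i < v.size(); ++i) s += v[i];
          return s;
      }

      static double sumByConstRef(const std::vector<double>& v)  // no copy at all
      {
          double s = 0;
          for (std::size_t i = 0; i < v.size(); ++i) s += v[i];
          return s;
      }

      int main()
      {
          std::vector<double> data(400, 1.0);       // roughly the size used in the question
          double a = 0, b = 0;
          for (int i = 0; i < 1000; ++i) {
              a += sumByValue(data);                // 1000 calls -> 400,000 element copies
              b += sumByConstRef(data);             // 1000 calls -> none
          }
          std::printf("%f %f\n", a, b);
          return 0;
      }

    Under the same assumption, changing the VValue and RatioPoint parameters to const std::vector<double>& would leave the algorithm unchanged while removing the per-call copies.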
