Search Results

Search found 2585 results on 104 pages for 'forensic analysis'.

Page 5 of 104

  • Trend analysis using iterative value increments

    - by Dave Jarvis
    We have configured iReport to generate the following graph: the real data points are in blue, the trend line is green. The problems include:

    - Too many data points for the trend line
    - Trend line does not follow a Bezier curve (spline)

    The source of the problem is the incrementer class. The incrementer is provided with the data points iteratively; there does not appear to be a way to get the whole set of data. The code that calculates the trend line looks as follows:

        import java.math.BigDecimal;
        import net.sf.jasperreports.engine.fill.*;

        /**
         * Used by an iReport variable to increment its average.
         */
        public class MovingAverageIncrementer implements JRIncrementer {
          private BigDecimal average;
          private int incr = 0;

          /** Instantiated by the MovingAverageIncrementerFactory class. */
          public MovingAverageIncrementer() {
          }

          /**
           * Returns the newly incremented value, which is calculated by
           * averaging the previous value from the previous call to this method.
           *
           * @param jrFillVariable Unused.
           * @param object New data point to average.
           * @param abstractValueProvider Unused.
           * @return The newly incremented value.
           */
          public Object increment( JRFillVariable jrFillVariable, Object object,
              AbstractValueProvider abstractValueProvider ) {
            BigDecimal value = new BigDecimal( ( (Number)object ).doubleValue() );

            // Average every 10 data points
            if( incr % 10 == 0 ) {
              setAverage( value.add( getAverage() ).doubleValue() / 2.0 );
            }

            incr++;
            return getAverage();
          }

          /**
           * Changes the value that is the moving average.
           * @param average The new moving average value.
           */
          private void setAverage( BigDecimal average ) {
            this.average = average;
          }

          /**
           * Returns the current moving average.
           * @return Value used for plotting on a report.
           */
          protected BigDecimal getAverage() {
            if( this.average == null ) {
              this.average = new BigDecimal( 0 );
            }
            return this.average;
          }

          /** Helper method. */
          private void setAverage( double d ) {
            setAverage( new BigDecimal( d ) );
          }
        }

    How would you create a smoother and more accurate representation of the trend line?
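
    One way to get a smoother trend, sketched here in Python since the idea is language-neutral: buffer all the data points first, then plot a centered moving average over a fixed window instead of averaging pairs on the fly. The window size below is illustrative.

        # Minimal sketch: centered moving average over a fixed window.
        # Assumes the raw points are collected into a list first, rather
        # than averaged incrementally as in the incrementer class above.

        def moving_average(points, window=10):
            """Return one smoothed value per input point."""
            half = window // 2
            smoothed = []
            for i in range(len(points)):
                lo = max(0, i - half)
                hi = min(len(points), i + half + 1)
                chunk = points[lo:hi]
                smoothed.append(sum(chunk) / len(chunk))
            return smoothed

        raw = [3.0, 4.0, 8.0, 7.0, 12.0, 11.0, 15.0, 14.0, 18.0, 21.0]
        print(moving_average(raw, window=4))

    For a spline-like curve, the same buffered series can then be passed through a spline interpolator (for example scipy.interpolate) over a thinned set of points.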

    Read the article

  • ASP .NET Code analysis tool to check cross site scripting

    - by Prashant
    I am aware of a tool from Microsoft that checks for cross-site scripting attacks etc. The tool is http://www.microsoft.com/downloads/details.aspx?FamilyId=0178e2ef-9da8-445e-9348-c93f24cc9f9d&displaylang=en But are there other tools you have used for ASP.NET applications that do something similar, and which one is most widely used in ASP.NET applications?

    Read the article

  • Alternatives to CAT.NET for website security analysis

    - by Gavin Miller
    I'm looking for an alternative tool to CAT.NET for performing static security scans on .NET code. Currently the CAT.NET tooling/development is at a somewhat fragile stage and doesn't offer the reliability that I'm looking for. Are there any alternative static code analyzers that you use for detecting security issues?

    Read the article

  • Fowler Analysis Patterns lately?

    - by Berryl
    As much as I've always loved this book, I've always wished there were more meaty examples of how to apply some of the concepts. Is anyone aware of anything out there worth looking at that attempts to do that? Cheers, Berryl
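
    For readers who want a concrete taste of the book, its Quantity pattern is easy to sketch: keep a number and its unit together so mismatched units fail loudly. A minimal illustration in Python (the names are mine, not the book's):

        # Minimal sketch of the Quantity analysis pattern:
        # an amount never travels without its unit.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Quantity:
            amount: float
            unit: str

            def __add__(self, other):
                # Adding mismatched units is a modeling error, so fail loudly.
                if self.unit != other.unit:
                    raise ValueError(f"cannot add {other.unit} to {self.unit}")
                return Quantity(self.amount + other.amount, self.unit)

        total = Quantity(70.0, "kg") + Quantity(5.0, "kg")
        print(total)                      # Quantity(amount=75.0, unit='kg')
        # Quantity(70.0, "kg") + Quantity(5.0, "lb") raises ValueError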

    Read the article

  • Power Analysis in [R] for Two-Way Anova

    - by Thomas
    I am trying to calculate the necessary sample size for a 2x2 factorial design. I have two questions.

    1) I am using the pwr package and the one-way ANOVA function to calculate the necessary sample size, with the following code:

        pwr.anova.test(k = , n = , f = , sig.level = , power = )

    However, I would like to look at two-way ANOVA, since this is more efficient at estimating group means than one-way ANOVA. There is no two-way ANOVA function that I could find. Is there a package or routine in [R] to do this?

    2) Moreover, am I safe in assuming that since I am using a one-way ANOVA power calculation, the sample size will be more conservative (i.e. larger)?
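
    For what it's worth, the same conservative one-way approximation (treating the 2x2 design as four groups) can be cross-checked in Python with statsmodels; effect_size is Cohen's f, the same convention pwr uses, and 0.25 below is just an illustrative medium effect:

        # Sanity-check sketch: one-way ANOVA power analysis, k = 4 groups
        # standing in for the 2x2 factorial (a conservative approximation).

        from statsmodels.stats.power import FTestAnovaPower

        n_total = FTestAnovaPower().solve_power(
            effect_size=0.25,   # Cohen's f, illustrative medium effect
            k_groups=4,         # the four cells of the 2x2 design
            alpha=0.05,
            power=0.80,
        )
        print(f"total sample size: {n_total:.1f}")   # divide by 4 for per-cell n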

    Read the article

  • DMX Analysis Services question

    - by user282382
    Hi, I have two mining models, both time series. One is [Company_Inputs] and the other is [Booking_Projections]. What I want to do is use EXTEND_MODEL_CASES to join the results of [Company_Inputs] as the extended cases. So basically something like:

        Select Flattened PredictTimeSeries([Bookings], 1, 6, EXTEND_MODEL_CASES)
        FROM [Booking_Projections]
        Natural Prediction Join
        (Select Flattened PredictTimeSeries([Metric1], 1, 6)
         From [Company_Inputs]) AS T

    This code of course doesn't work, but the idea is to use the predictions made by [Company_Inputs] as cases for predicting future values of [Booking_Projections]. If anyone has an idea of how I can accomplish this I would appreciate it very much.

    Read the article

  • Visual Studio 2008 profiler analysis - missing time

    - by Scott Vercuski
    I ran the Visual Studio 2008 profiler against my ASP.NET application and came up with the following result set.

        CURRENT FUNCTION                                   | TIME (msec)
        ---------------------------------------------------|------------
        Data.GetItem(params)                               |   10,158.12

        Functions that were called by Data.GetItem(params) | TIME (msec)
        ---------------------------------------------------|------------
        Model.GetSubItem(params)                           |        0.83
        Model.GetSubItem2(params)                          |        0.77
        Model.GetSubItem3(params)                          |        0.76
        etc.

    The issue I'm facing is that the times of the functions called by Data.GetItem(params) do not sum to the 10,158.12 msec total. This would lead me to believe that the bulk of the time is actually spent executing the code within that method itself. My question is: does Visual Studio provide a way to analyze the method itself, so I can see which sections of code are taking the longest? If it does not, are there any recommended tools to do this? Or should I start writing my own timing scripts? Thank you
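
    On the last point, a hand-rolled section timer is only a few lines in any language (in .NET, System.Diagnostics.Stopwatch plays this role); the shape of the idea, sketched in Python:

        # Minimal sketch of a scoped section timer for narrowing down
        # where time goes inside one large method.

        import time
        from contextlib import contextmanager

        @contextmanager
        def timed(label):
            start = time.perf_counter()
            try:
                yield
            finally:
                ms = (time.perf_counter() - start) * 1000.0
                print(f"{label}: {ms:.2f} ms")

        with timed("suspect section"):
            time.sleep(0.05)   # stand-in for a block of the method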

    Read the article

  • Accelerometer data analysis

    - by jrrt
    Hello, I would like to know if there are some libraries/algorithms/techniques (Python, if at all possible) that help to extract features from accelerometer data (extracted from an Android phone, btw), like periodicity of movements, energy of acceleration and the like. Has anyone done this kind of task before? Thank you very much in advance :)
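
    As a starting point with a plain numpy stack (synthetic data below, since the real samples aren't shown): periodicity can be read off the FFT magnitude spectrum, and energy is the mean squared sample value.

        # Minimal sketch: two common accelerometer features in numpy.
        # Assumes a fixed sampling rate; real phone data may need resampling.

        import numpy as np

        fs = 50.0                              # sampling rate in Hz (assumed)
        t = np.arange(0, 10, 1 / fs)
        sig = np.sin(2 * np.pi * 2.0 * t) + 0.3 * np.random.randn(t.size)

        # Periodicity: dominant frequency of the de-meaned signal.
        spectrum = np.abs(np.fft.rfft(sig - sig.mean()))
        freqs = np.fft.rfftfreq(sig.size, d=1 / fs)
        dominant_hz = freqs[spectrum.argmax()]

        # Energy of acceleration: mean of squared samples.
        energy = np.mean(sig ** 2)

        print(f"dominant: {dominant_hz:.2f} Hz, energy: {energy:.3f}")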

    Read the article

  • Investment advice data dump analysis

    - by portoalet
    For my year-end pet project, I'd like to analyze investment advice and its correlation to stock market performance. The problem is, where do I get a dump of investment advice data (free)? Something like the stackoverflow.com data dump would be nice. Or maybe it's easier to do distributed crawling and crawl public finance webpages for investment advice? Investment advice here means buy/sell advice for stocks/forex, issued by an institution or investment advisor.
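
    On the crawling route, a single-machine prototype (fetch pages, keep sentences that look like calls) is a reasonable first step before any distribution. A minimal standard-library sketch; the URL and keyword list are placeholders:

        # Toy collector for buy/sell sentences. The URL is a placeholder;
        # a real crawler also needs robots.txt handling, rate limiting,
        # and a proper HTML parser instead of regex tag stripping.

        import re
        import urllib.request

        ADVICE = re.compile(r"\b(buy|sell|hold|upgrade|downgrade)\b", re.I)

        def advice_sentences(url):
            raw = urllib.request.urlopen(url, timeout=10).read()
            text = re.sub(r"<[^>]+>", " ", raw.decode("utf-8", "replace"))
            for sentence in re.split(r"(?<=[.!?])\s+", text):
                if ADVICE.search(sentence):
                    yield sentence.strip()

        for s in advice_sentences("https://example.com/finance-news"):
            print(s)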

    Read the article

  • Code Coverage Analysis for Embedded C++ projects

    - by Steve Hawkins
    I have recently started working on a very large C++ project whose team, after completing 90% of the implementation, has determined that it needs to demonstrate 100% branch coverage during testing. The project is hosted on an embedded platform (Green Hills Integrity). I'm looking for suggestions and experiences from others on StackOverflow who have used code coverage products in similar environments. I'm interested in both positive and negative comments regarding these types of tools.

    Read the article

  • Blob ID matching over multiple frames in C++ (image analysis)

    - by pollux
    Dear reader, I'm working on a blob matching and tracking library in C++. Currently I'm using OpenCV to detect blobs and try to match blobs in a new frame by checking the position, velocity and size of each blob. This works quite well and I'm getting a high blob match rate (95% or higher). Sometimes blobs fall out of the image or new blobs appear. Now I need to give matched blobs the same ID they had before. I'm wondering if there are typical or commonly used techniques for doing this, or even some keywords I can google. Thanks
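
    The usual keywords are "data association", "gating", "Hungarian algorithm" and "Kalman tracking". A common baseline: build a cost matrix from position and size, take the cheapest pairings under a threshold, and mint new IDs for the leftovers. A greedy sketch in Python (the threshold and blob tuples are illustrative):

        # Greedy blob ID assignment between two frames.
        # Blobs are (x, y, size); real trackers add velocity and better gating.

        import itertools

        def match_ids(prev, curr, max_cost=50.0):
            """prev: {blob_id: (x, y, size)}, curr: list of (x, y, size)."""
            def cost(a, b):
                dist = ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
                return dist + abs(a[2] - b[2])

            pairs = sorted((cost(p, c), pid, ci)
                           for (pid, p), (ci, c)
                           in itertools.product(prev.items(), enumerate(curr)))
            assigned, used = {}, set()
            for c, pid, ci in pairs:
                if c <= max_cost and pid not in assigned and ci not in used:
                    assigned[pid] = curr[ci]      # matched: keep the old ID
                    used.add(ci)
            new_id = max(prev, default=-1) + 1
            for ci, blob in enumerate(curr):      # unmatched: brand-new blob
                if ci not in used:
                    assigned[new_id] = blob
                    new_id += 1
            return assigned

        prev = {0: (10, 10, 5), 1: (40, 40, 8)}
        curr = [(12, 11, 5), (90, 90, 6)]
        print(match_ids(prev, curr))   # blob 0 keeps its ID; (90, 90, 6) is new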

    Read the article

  • SQL Analysis Services - Dimension attributes with a "many" cardinality

    - by MonkeyBrother
    I am creating a cube with the following tables:

        Customer:      CustomerID, Name
        Customer Rep:  CustomerID, RepID
        Rep:           RepID, Name

    The important thing here is that there is a many-to-many relationship between reps and customers. I want to be able to ask the question "How much sales for customers working with rep 'A'?" In the data source view I set up the relationships between both CustomerID columns and both RepID columns. I set up the rep attribute in the dimension builder, and when I try to build the cube I get this error:

        Errors in the high-level relationship engine. The 'Rep' table that is
        required for a join cannot be reached based on the relationships in
        the data source view.
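
    Outside of SSAS, the join path itself is easy to sanity-check: hop Customer to Customer Rep to Rep, then aggregate. A Python/pandas sketch with invented sales numbers, just to show the many-to-many shape:

        # The bridge-table join behind "sales for customers working with rep A".
        # Sales figures are invented for illustration.

        import pandas as pd

        customer = pd.DataFrame({"CustomerID": [1, 2, 3],
                                 "Sales": [100.0, 250.0, 80.0]})
        bridge = pd.DataFrame({"CustomerID": [1, 2, 2, 3],
                               "RepID": [10, 10, 20, 20]})
        rep = pd.DataFrame({"RepID": [10, 20], "Name": ["A", "B"]})

        joined = customer.merge(bridge, on="CustomerID").merge(rep, on="RepID")
        print(joined.groupby("Name")["Sales"].sum())
        # A    350.0   (customers 1 and 2)
        # B    330.0   (customer 2 counts for both reps)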

    Read the article

  • Text editor with "forensic" capabilities?

    - by Timo
    This is what happened: I wrote a perl script using TextWrangler and managed to change the encoding to UTF8 BOM, which inserts the BOM marker at the start of the file. Perl promptly misses the #! and mayhem ensues. It then takes me the better part of an afternoon to figure this out, since most text editors do not show the BOM marker even with various "show invisibles" options turned on. Now, I've learned my lesson, I should have used less immediately, etc. etc. What I'm wondering, though, is whether there is a text editor out there that lets you see every single byte of the file, even if they are "invisible"?
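
    Short of a dedicated hex editor, a few lines of script make every byte visible; the UTF-8 BOM shows up as "ef bb bf" at offset 0, right before the "#!". A minimal Python sketch:

        # Minimal hexdump: every byte of the file, "invisible" or not.

        import sys

        def hexdump(path, width=16):
            with open(path, "rb") as f:
                data = f.read()
            for off in range(0, len(data), width):
                chunk = data[off:off + width]
                hexes = " ".join(f"{b:02x}" for b in chunk)
                text = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
                print(f"{off:08x}  {hexes:<{width * 3}} {text}")

        hexdump(sys.argv[1])   # e.g. python hexdump.py script.pl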

    Read the article

  • Analysis Services with excel as front end - is it possible to get the nicer UI that powerpivot provides?

    - by AJM
    I have been looking into PowerPivot and concluded that for "self service BI" and ad hoc building of cubes it has its uses. In particular I like the enhanced UI that you get from using PowerPivot, rather than just using a PivotTable hooked up to an Analysis Services datasource. However it seems that hooking up PowerPivot to an existing Analysis Services cube is not a solution for "organisational BI". It is not always desirable to suck millions of rows into Excel at once, and the interface between PowerPivot and Analysis Services is very poor in my book. Hence the question: can an existing Analysis Services solution get the enhanced UI features that PowerPivot brings, without using PowerPivot as the design tool? If PowerPivot is aimed at self-service/personal BI, then it seems bizarre that the UI for this is better than for bigger/more costly Analysis Services solutions.

    Read the article

  • Specify a custom dictionary for FxCop and Visual Studio source analysis

    - by Marko Apfel
    Renaming the default custom dictionary from CustomDictionary.xml to another name – for instance FxCop.CustomDictionary.xml – needs some additional changes to work in the involved applications.

    Visual Studio Team System code analysis

    For Visual Studio Team System code analysis, this file should be added as a link to all projects and its Build Action set to CodeAnalysisDictionary.

    Build target

    In a build target, the command line tool FxCopCmd should be called with the /dictionary parameter:

        <Target Name="FxCop">
          <Exec Command="&quot;$(ProjectDir)..\..\build\FxCop\FxCopCmd.exe&quot; /file:&quot;$(TargetPath)&quot; /project:&quot;$(ProjectDir)..\EsriDE.SfgPraxair.FxCop&quot; /directory:&quot;$(ProjectDir)..\..\lib\Esri.ArcGIS&quot; /directory:&quot;$(ProjectDir)..\..\lib\Microsoft&quot; /dictionary:&quot;$(ProjectDir)..\FxCop.CustomDictionary.xml&quot; /out:&quot;$(OutDir)..\$(ProjectName).FxCopReport.xml&quot; /console /forceoutput /ignoregeneratedcode">
          </Exec>
          <Message Text="FxCop finished." />
        </Target>

    FxCop GUI (standalone application)

    In the FxCop GUI there is no option to specify a custom file name, but you can add a hint in the FxCop project file. Open this file and look for the line:

        <CustomDictionaries SearchFxCopDir="True" SearchUserProfile="True" SearchProjectDir="True" />

    Then change it to:

        <CustomDictionaries SearchFxCopDir="True" SearchUserProfile="True" SearchProjectDir="True">
          <CustomDictionary Path="FxCop.CustomDictionary.xml"/>
        </CustomDictionaries>

    Ready :-)

    Read the article

  • Great Example of a Simple Cost-Benefit Analysis

    - by BuckWoody
    I saw a post the other day that you should definitely go check out. It's a cost/benefit decision, and although the author gives it a quick treatment and doesn't take every point in the decision into account, you should focus on the process he follows. It's a quick and simple example of the kind of thought process we should have as data professionals when we pick a server, a process, an application, or even platform software. The key is to include more than just the price of a piece of software or hardware. You need to think about the "other" costs in the decision, and then make the right one. Sometimes the cheapest option is the cheapest, and other times, well, it isn't. I've seen this played out not only in the decision to go with a certain selection, but in the options or editions it comes in. You have to put all of the decision points in the analysis to come up with the right answer, and you have to be able to explain your logic to your team and your company. This is the way you become a data professional, not just a DBA. You can check out the post here – it deals with Azure, but the point is the process, not Azure itself: http://blogs.msdn.com/eugeniop/archive/2010/03/19/windows-azure-guidance-a-simplistic-economic-analysis-of-a-expense-migration.aspx

    Read the article

  • Analysis of nopCommerce

    - by chanva
    More and more medium-sized and small enterprises would like an eCommerce website to sell their products or services. A free and open source project should be the first choice. I found that nopCommerce is a good option; you can see the detailed analysis in the article.

    Read the article

  • Using Keyword Analysis to Write Articles and Blogs

    Keyword analysis is a process by which you can discover what search phrases users enter at search engines to find information. Keywords are nothing but search words or phrases entered by users at search engines like Google, Yahoo and Bing. For article, blog and web content writers, keyword research is the most important part of the process.

    Read the article

  • Automating SQL Execution Plan analysis

    - by jchang
    Last year, I made my tool for automating execution plan analysis available on www.qdpma.com The original version could parse execution plans from sys.dm_exec_query_stats or dm_exec_cached_plans and generate a cross-reference of which execution plans employed each index. The DMV sys.dm_db_index_usage_stats shows how often each index is used, but not where, that is, which particular stored procedure or My latest version can now also 1) use the DMV sys.dm_exec_procedure_stats, 2) it can also get the...(read more)

    Read the article

  • Free SEO Analysis using IIS SEO Toolkit

    - by The Official Microsoft IIS Site
    In my spare time I've been thinking about new ideas for the SEO Toolkit, and it occurred to me that rather than continuing to dream up more reports and better diagnostics against some random fake sites, it could be interesting to ask openly whether anyone wants a free SEO analysis report of their site, and to test drive some of it against real sites. So what is in it for you: I will analyze your site to look for common SEO errors, I will create a digest of actions to take and other...(read more)

    Read the article

  • New binary analysis tool finds FOSS in device firmware

    Ars Technica: "Software development company Loohuis Consulting and process management consultancy OpenDawn have released a new binary analysis tool that is designed to detect Linux and BusyBox in binary firmware. The program, which is freely available for download, is intended to aid open source license compliance efforts."

    Read the article
