Search Results

Search found 164 results on 7 pages for 'breakdown'.

Page 4/7 | < Previous Page | 1 2 3 4 5 6 7  | Next Page >

  • Is there a log file analyzer for log4j files?

    - by Juha Syrjälä
    I am looking for some kind of analyzer tool for log files generated by log4j. I am looking for something more advanced than grep. What are you using for log file analysis? I am looking for the following kinds of features: the tool should tell me how many times a given log statement or stack trace has occurred, preferably with support for some kind of pattern matching (e.g. the number of log statements matching 'User [a-z]* logged in'). Breakdowns by log level (how many INFO, DEBUG lines) and by the class that initiated the log message would be nice. Breakdown by date (how many log statements in a given time period). Which log lines occur commonly together? Support for several files, since I am using log rolling. Hot spot analysis: find whether there is some time period with an unusually high number of log statements. Either command-line or GUI is fine. Open source is preferred, but I am also interested in commercial offerings. My log4j configuration uses org.apache.log4j.PatternLayout with the pattern %d %p %c - %m%n, but that could be adapted for the analyzer tool.
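    While settling on a tool, the per-level breakdown described above is also easy to script. A minimal sketch (in C#, though any language would do) assuming lines in the %d %p %c - %m%n layout; the file name and the regex are illustrative assumptions, not part of the original question:

        using System;
        using System.IO;
        using System.Linq;
        using System.Text.RegularExpressions;

        class LogLevelBreakdown
        {
            static void Main()
            {
                // Matches lines like "2010-04-21 10:15:00,123 INFO com.example.Foo - message"
                // as produced by the %d %p %c - %m%n PatternLayout.
                var linePattern = new Regex(@"^\S+ \S+ (?<level>[A-Z]+) \S+ - ");

                var counts = File.ReadLines("app.log")          // hypothetical log file name
                    .Select(l => linePattern.Match(l))
                    .Where(m => m.Success)
                    .GroupBy(m => m.Groups["level"].Value);

                // Print a simple count per log level (INFO, DEBUG, WARN, ...).
                foreach (var g in counts.OrderByDescending(g => g.Count()))
                    Console.WriteLine("{0,-6} {1}", g.Key, g.Count());
            }
        }

    The same grouping idea extends to per-class or per-date breakdowns by switching the capture group used as the key.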

    Read the article

  • ASP.NET MVC View Engine Comparison

    - by McKAMEY
    EDIT: added a community wiki to begin capturing people's experience with various View Engines. Please respectfully add any experiences you've had. I've been searching on SO & Google for a breakdown of the various View Engines available for ASP.NET MVC, but haven't found much more than simple high-level descriptions of what a view engine is. I'm not necessarily looking for "best" or "fastest" but rather some real world comparisons of advantages / disadvantages of the major players (e.g. the default WebFormViewEngine, MvcContrib View Engines, etc.) for various situations. I think this would be really helpful in determining if switching from the default engine would be advantageous for a given project or development group. Has anyone encountered such a comparison?

    Read the article

  • magento - multiple tax rates

    - by Fiona
    I've built a Magento site for a Canadian company where each province has a retail sales tax and a federal tax rate, and these rates differ! So I set up the rates, and the tax is being calculated correctly, i.e. the sum of the two tax rates is applied correctly. However, on selecting the breakdown-of-tax mode in the admin panel, it would appear that there is something wrong with my setup. E.g.:
        Subtotal                            $129.99
        GST/HST Quebec (5%)                 $ 16.25
        Provincial Sales Tax Quebec (7.5%)
        Tax                                 $ 16.25
        Grand Total                         $158.23
    $16.25 is 12.5% of $129.99, so the total tax figure is correct. However, it should be displaying as follows:
        Subtotal                            $129.99
        GST/HST Quebec (5%)                 $  6.50
        Provincial Sales Tax Quebec (7.5%)  $  9.75
        Tax                                 $ 16.25
        Grand Total                         $158.23
    Anyone come across this before? Have any suggestions on how to fix it? Many thanks, Fiona

    Read the article

  • VB.net 'cross-referencing' tables.

    - by Lee
    Hello, I have a DataGridView that is being filled with data from a table. Inside this table is a column called 'group' that holds the ID of an individual group in another table. What I would like to do, when the DataGridView is filled, is display the name of the group instead of the ID contained in 'group'. Is there some type of VB.NET 'magic' that can do this, or do I need to cross-reference the data myself? Here is a breakdown of what the two tables look like:
        table1: id, group (holds the value of the id column in table2), weight, last_update
        table2: id, description (this is what I would like displayed in the DGV)
    take care, lee BTW - I am using Visual Studio Express.
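    Not from the article, but one common approach, sketched here in C# against the hypothetical table and column names above: fill the grid from a query that joins table2, so the description comes back instead of the raw group id. The connection string and control are placeholders:

        using System.Data;
        using System.Data.OleDb;
        using System.Windows.Forms;

        static class GridLoader
        {
            // Fills the DataGridView from a JOIN so the group description is shown
            // instead of the numeric id held in table1.group.
            public static void LoadGrid(DataGridView grid, string connectionString)
            {
                const string sql =
                    "SELECT t1.id, t2.description AS group_name, t1.weight, t1.last_update " +
                    "FROM table1 AS t1 INNER JOIN table2 AS t2 ON t1.[group] = t2.id";

                using (var adapter = new OleDbDataAdapter(sql, connectionString))
                {
                    var table = new DataTable();
                    adapter.Fill(table);
                    grid.DataSource = table;
                }
            }
        }

    An alternative that keeps the original query is a DataGridViewComboBoxColumn bound to table2 with ValueMember = "id" and DisplayMember = "description", mapped to the grid's group column.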

    Read the article

  • Eclipse Java Profiler

    - by Jeff Storey
    I know this has been asked before, but I have not found anything recent that really gives a good answer. I'm trying to find a free profiler for eclipse that works well. I would like a graphical breakdown of execution time in particular. I've tried TPTP but have had no luck at all with GUI apps (it took almost a minute for a GUI app to start and was virtually unusable on screen - it uses a lot of Java OpenGL, so I'm not sure if it has to do with that). I liked YourKit, but unfortunately it's not free. I even tried switching to NetBeans since they have a built in profiler. If anyone has had success with particular profilers (even if it was TPTP), I'd like to hear about it. Any recommendations would be greatly appreciated. thanks, Jeff

    Read the article

  • Odd GROUP BY output DB2 - Results not as expected

    - by CallCthulhu
    If I run the following query:
        select load_cyc_num,
               crnt_dnlq_age_cde,
               sum(cc_min_pymt_amt) as min_pymt,
               sum(ec_tot_bal) as budget,
               case when ec_tot_bal <> 0 then 'Y' else 'N' end as budget,
               case when ac_stat_cde in ('A0P','A1P','ARP','A3P') then 'Y' else 'N' end as arngmnt,
               sum(sn_close_bal) as st_bal
        from statements
        where (sn_close_bal <> 0 or ec_tot_bal <> 0)
          and load_cyc_num in (200911)
        group by load_cyc_num,
                 crnt_dnlq_age_cde,
                 case when ec_tot_bal <> 0 then 'Y' else 'N' end,
                 case when ac_stat_cde in ('A0P','A1P','ARP','A3P') then 'Y' else 'N' end
    then I get the correct "BUDGET" grouping, but not the correct "ARRANGEMENT" grouping; only two rows have a "Y". If I change the order of the case statements in the GROUP BY, then I get the correct grouping (a full Y/N breakdown for both columns). Am I missing something obvious?

    Read the article

  • Action Controller: Exception - ID not found

    - by Danny McClelland
    Hi everyone, I am slowly getting the hang of Rails, and thanks to a few people I now have a basic grasp of database relations and associations etc. You can see my previous questions here: http://stackoverflow.com/questions/2714621/rails-database-relationships I have set up my application's models with all of the necessary has_one and has_many :through etc., but when I go to add a kase and choose a company from the drop-down list, it doesn't seem to be assigning the company ID to the kase. You can see a video of the application and the error here: http://screenr.com/BHC You can see a full breakdown of the application and the relevant source code at the Git repo here: http://github.com/dannyweb/surveycontrol If anyone could shed some light on my mistake I would appreciate it very much! Thanks, Danny

    Read the article

  • How to quickly analyse large MDB file

    - by Craig Johnston
    I need to know how to quickly analyse a large MDB file (about 1GB) to see which tables are causing it to be so big. Is there something that will easily show me a breakdown of which tables are responsible for how much data? I need to know whether it is just this one customer who is using the application differently, or whether there is genuinely a lot of data in the MDB. This MDB is currently causing the VB app to crash, and I need to know why it is so big so that I can maybe think about how to put some of the data into another 'archival' MDB. Migrating to SQL Server is not an option, unless the use of linked tables from an MDB is a realistic option.
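    Not a specific tool recommendation, but a rough C# sketch of one way to get a quick per-table row count from an MDB over OLE DB. Row counts are only a proxy for size, and the provider string and file path below are assumptions for illustration:

        using System;
        using System.Data;
        using System.Data.OleDb;

        class MdbTableSizes
        {
            static void Main()
            {
                // Jet 4.0 handles classic .mdb files; the path is hypothetical.
                var connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\big.mdb";
                using (var conn = new OleDbConnection(connStr))
                {
                    conn.Open();
                    DataTable tables = conn.GetSchema("Tables");
                    foreach (DataRow row in tables.Rows)
                    {
                        // User tables have TABLE_TYPE = "TABLE"; system tables are skipped.
                        if ((string)row["TABLE_TYPE"] != "TABLE") continue;
                        string name = (string)row["TABLE_NAME"];

                        using (var cmd = new OleDbCommand("SELECT COUNT(*) FROM [" + name + "]", conn))
                            Console.WriteLine("{0,-30} {1}", name, cmd.ExecuteScalar());
                    }
                }
            }
        }

    Wide memo/OLE columns can make a table large even with few rows, so a count like this is only a first cut before compacting or archiving.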

    Read the article

  • Speeding up jQuery empty() or replaceWith() Functions When Dealing with Large DOM Elements

    - by Levi Hackwith
    Let me start off by apologizing for not giving a code snippet. The project I'm working on is proprietary and I'm afraid I can't show exactly what I'm working on. However, I'll do my best to be descriptive. Here's a breakdown of what goes on in my application: the user clicks a button; the server retrieves a list of images in the form of a data table; each row in the table contains 8 data cells that in turn each contain one hyperlink; each request by the user can contain up to 50 rows (I can change this number if need be); that means the table contains upwards of 800 individual DOM elements. My analysis shows that jQuery("#dataTable").empty() and jQuery("#dataTable").replaceWith(tableCloneObject) take up 97% of my overall processing time and take on average 4-6 seconds to complete. I'm looking for a way to speed up either of the above-mentioned jQuery functions when dealing with massive DOM elements that need to be removed/replaced. I hope my explanation helps.

    Read the article

  • Alternatives to MS project server

    - by Ajaxx
    I manage a small group and I keep my work breakdown in MS Project. However, it's difficult to provide my team with an adequate view into the project and the ability to report on their progress. I looked at MS Project Server (the SharePoint web part), but it's an expensive proposition. Has anyone had any experience with another tool (commercial is fine) that helps a team view and report on their work as managed by MS Project? FWIW, I have looked at OpenProj and it appears to be a decent solution for viewing project files on the desktop. Anything web-based would be preferred, keeping in mind that I'd like people to report on their work, not just view it.

    Read the article

  • MVC-3 User-Image Management - Best Practices

    - by Rob
    Hello experts, I'm developing with MVC 3, Razor and C#. I've been searching around and cannot find the advice I'm looking for. My site will contain user-uploaded images (possibly a high number). What is the best practice for managing these pictures (placement, breakdown into sub-folders, etc.)? Where do I place them so that they won't get accidentally blown away if I republish my site periodically? If there are any good articles or blog posts, that would be helpful. Also, any advice/tips anyone wants to add would be great. Thanks for your time! Rob EDIT: I would also like to know what people do to prevent hotlinking.
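    Not an authoritative answer, but a hedged MVC 3 sketch of the usual pattern: keep uploads under a single root that the publish step does not overwrite (App_Data here, though a folder outside the site root works too), bucketed into sub-folders with GUID file names. The controller, action and folder names are placeholders:

        using System;
        using System.IO;
        using System.Web;
        using System.Web.Mvc;

        public class ImagesController : Controller
        {
            [HttpPost]
            public ActionResult Upload(HttpPostedFileBase file)
            {
                if (file == null || file.ContentLength == 0)
                    return new HttpStatusCodeResult(400);

                // App_Data is not served directly and survives a typical republish.
                string root = Server.MapPath("~/App_Data/UserImages");

                // Bucket by user so no single folder grows too large; assumes authentication.
                string userFolder = Path.Combine(root, User.Identity.Name);
                Directory.CreateDirectory(userFolder);

                // GUID names avoid collisions and hide the original file names.
                string fileName = Guid.NewGuid() + Path.GetExtension(file.FileName);
                file.SaveAs(Path.Combine(userFolder, fileName));

                return Json(new { name = fileName });
            }
        }

    Serving the images back through a controller action (rather than direct static URLs) also gives a natural place to require authentication or check the Referer header, which helps with the hotlinking question in the edit.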

    Read the article

  • Tracing\profiling instructions

    - by LeChuck2k
    Hi y'all. I'd like to statistically profile my C code at the instruction level. I need to know how many additions, multiplications, divides, etc. I'm performing. This is not your usual run-of-the-mill code profiling requirement. I'm an algorithm developer and I want to estimate the cost of converting my code to hardware implementations. For this, I'm being asked for the instruction breakdown at run-time (parsing the compiled assembly isn't sufficient, as it doesn't account for loops in the code). After looking around, it seems VMware may offer a possible solution, but I still couldn't find the specific feature that would allow me to trace the instruction stream of my process. Are you aware of any profiling tools that enable this?

    Read the article

  • ASP.net Treeview/Listview combination or alternative: Tutorials? Help?

    - by jlrolin
    I need to create an ASP.NET page with a control that has a five-level TreeView on the left side of the page and accounting balances on the right side that coincide with each breakdown in the tree. The top level is company, next is group, next is program, etc., and the balances break down accordingly. I've seen that there are controls out there, such as TreeView/ListView combination controls, that can do this. Are there any tutorials or help out there on how to go about accomplishing this without paying for controls? Could a TreeView do this alone by spanning data across the entire length of the columns, since every level will have totals on it?

    Read the article

  • google search rankings and trends api

    - by Drakkhen
    I'm looking for an API, program or interface to get the following information: I would like to see the number of times a particular search term was used, its weekly/monthly/yearly breakdown, and its rank on a particular page. I've found googlesearchpositionfinder.com and google.com/trends, but I have 5000 terms, so searching by hand is not happening. I've also found www.juiceanalytics.com/openjuice/programmatic-google-trends-api, but it doesn't allow me to do a breakdown for a period of 2 years. Basically I'm trying to create a ranking of search phrases, the weeks (periods) in which they were most popular, and how a particular site (e.g. Urban Dictionary) showed up in Google's search rankings for those terms.

    Read the article

  • What did they program this toy with?

    - by Trix
    A rather strange question: I often ask myself what programming languages things were created with. I recently found at home this toy mini computer I played with when I was 13 or so. (Note: it is not one of those toy "notebooks"; it's really small and came as an extra with a magazine.) "Features": Hardware: an LCD with a small field of pixels where the games take place, plus some stats such as score, high score etc.; sounds and horrible music at startup; a really small "keyboard" on a wire. Software: at least 14 or so games, from Snake through Tetris and Breakdown to some abomination of a car racing game; a calculator; a game selection menu; an alarm clock. Inside there is a really small circuit board; I don't want to open the thing up now, though. Can you imagine whether the games and "Operating System" of this thing were actually programmed using a language? If yes, what language could it be? If not with a programming language, how else was it created?

    Read the article

  • 'cross-referencing' DataTables

    - by Lee
    I have a DataGridView that is being filled with data from a table. Inside this table is a column called 'group' that holds the ID of an individual group in another table. What I would like to do, when the DataGridView is filled, is display the name of the group instead of the ID contained in 'group'. Is there some type of VB.NET 'magic' that can do this, or do I need to cross-reference the data myself? Here is a breakdown of what the two tables look like:
        table1: id, group (holds the value of the id column in table2), weight, last_update
        table2: id, description (this is what I would like displayed in the DGV)
    BTW - I am using Visual Studio Express.

    Read the article

  • Help with MySQL and CASE WHEN with a range of values

    - by kickdaddy
    I have an accounts table and a records table where accounts have multiple records. I would like to break down the account totals by "count of records" range, i.e. show a breakdown like:
        Count of Records | Count
        0-25             | 100
        25-50            | 122
        50-100           | 300
        etc.
    I am using the following query, but I can't get it to group by "grp", which is what I want. Any help on the best way to modify the query? Thanks!
        SELECT count(*) as ct,
               CASE WHEN COUNT(*) < 25 THEN '1-25'
                    WHEN COUNT(*) >= 25 < 50 THEN '25-50'
                    WHEN COUNT(*) >= 50 < 100 THEN '50-100'
                    WHEN COUNT(*) >= 100 < 250 THEN '100-250'
                    WHEN COUNT(*) >= 250 < 500 THEN '250-500'
                    WHEN COUNT(*) >= 500 < 1000 THEN '500-1000'
                    ELSE '1000+'
               END AS grp
        FROM records r, accounts a
        WHERE r.account_id = a.id
        ORDER BY ct

    Read the article

  • Making interactive touch objects on Android

    - by Greenhouse_Gases
    I've never built a game before, and I've not programmed for Android before but am looking to do so over the summer by building a game. What type of object do I use for a shape that I want the user to be able to drag around the screen for instance using touch gestures? How do I tie together the MotionEvent, View and Graphics2D to make objects drawn on screen that can be interacted with? I imagine this will use ActionListeners / Handlers but I'm a bit confused at this stage... A simple breakdown of steps would be much appreciated. Thanks

    Read the article

  • Is your test method self-validating?

    - by mehfuzh
    Writing state-of-the-art unit tests that can validate every part of your framework is challenging and interesting at the same time; it's like becoming a samurai. One of the key concepts is to keep our tests in sync as the underlying code changes, breaking them down into the smallest units possible. This also means we should avoid multiple conditions embedded in a single test. Let's consider the following example of transferring funds.
        [Fact]
        public void ShouldAssertTranserFunds()
        {
            var currencyService = Mock.Create<ICurrencyService>();
            //// current rate
            Mock.Arrange(() => currencyService.GetConversionRate("AUS", "CAD")).Returns(0.88f);
            Account to = new Account { Currency = "AUS", Balance = 120 };
            Account from = new Account { Currency = "CAD" };
            AccountService accService = new AccountService(currencyService);
            Assert.Throws<InvalidOperationException>(() => accService.TranferFunds(to, from, 200f));
            accService.TranferFunds(to, from, 100f);
            Assert.Equal(from.Balance, 88);
            Assert.Equal(20, to.Balance);
        }
    At first look it seems OK, but as you look more closely, it is actually doing two tasks in one test. At line #10 it is trying to validate the exception for an invalid fund transfer, and finally it asserts that the currency conversion was made successfully. Here, the name of the test itself is pretty vague. The first rule when writing a unit test is that its name should reflect the inner workings of the target code, so that just by looking at the name it is self-explanatory. Having an obscure name for a test not only increases the chances of cluttering the test code, it also invites adding multiple paths into it and eventually makes things as messy as possible. I would rather have two test methods that explicitly describe their intent and are more self-validating:
        ShouldThrowExceptionForInvalidTransferOperation
        ShouldAssertTransferForExpectedConversionRate
    Having this type of breakdown also helps us pinpoint reported bugs easily rather than wasting time debugging something more general, and it can minimize confusion among team members. Finally, we should always make our tests F.I.R.S.T (Fast. Independent. Repeatable. Self-validating. Timely) [Bob Martin - Clean Code]. Only this will be enough to ensure our tests are as simple and clean as possible. Hope that helps.
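    To make the suggestion concrete, here is one possible shape for the two smaller tests named above, reusing the post's own Account/AccountService example and mocking calls - a sketch, not code from the original article:

        [Fact]
        public void ShouldThrowExceptionForInvalidTransferOperation()
        {
            var currencyService = Mock.Create<ICurrencyService>();
            Account to = new Account { Currency = "AUS", Balance = 120 };
            Account from = new Account { Currency = "CAD" };
            AccountService accService = new AccountService(currencyService);

            // Only the invalid-amount path is exercised here.
            Assert.Throws<InvalidOperationException>(() => accService.TranferFunds(to, from, 200f));
        }

        [Fact]
        public void ShouldAssertTransferForExpectedConversionRate()
        {
            var currencyService = Mock.Create<ICurrencyService>();
            Mock.Arrange(() => currencyService.GetConversionRate("AUS", "CAD")).Returns(0.88f);
            Account to = new Account { Currency = "AUS", Balance = 120 };
            Account from = new Account { Currency = "CAD" };
            AccountService accService = new AccountService(currencyService);

            // Only the conversion math is verified here.
            accService.TranferFunds(to, from, 100f);

            Assert.Equal(88, from.Balance);
            Assert.Equal(20, to.Balance);
        }

    Each test now fails for exactly one reason, which is what makes the names self-validating.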

    Read the article

  • Advice Needed: Developers blocked by waiting on code to merge from another branch using GitFlow

    - by fogwolf
    Our team just made the switch from FogBugz & Kiln/Mercurial to Jira & Stash/Git. We are using the Git Flow model for branching, adding subtask branches off of feature branches (relating to Jira subtasks of Jira features). We are using Stash to assign a reviewer when we create a pull request to merge back into the parent branch (usually develop, but for subtasks back into the feature branch).

    The problem we're finding is that even with the best planning and breakdown of feature cases, when multiple developers are working together on the same feature, say on the front end and back end, if they are working on interdependent code that sits in separate branches, one developer ends up blocking the other. We've tried pulling between each other's branches as we develop. We've also tried creating local integration branches each developer can pull from multiple branches to test the integration as they develop. Finally, and this seems to work possibly the best for us so far, though with a bit more overhead, we have tried creating an integration branch off of the feature branch right off the bat. When a subtask branch (off of the feature branch) is ready for a pull request and code review, we also manually merge those change sets into this feature integration branch. Then all interested developers are able to pull from that integration branch into other dependent subtask branches. This prevents anyone from waiting for any branch they are dependent upon to pass code review.

    I know this isn't necessarily a Git issue - it has to do with working on interdependent code in multiple branches, mixed with our own work process and culture. If we didn't have the strict code-review policy for develop (the true integration branch), then developer 1 could merge to develop for developer 2 to pull from. Another complication is that we are also required to do some preliminary testing as part of the code review process before handing the feature off to QA. This means that even if front-end developer 1 is pulling directly from back-end developer 2's branch as they go, if back-end developer 2 finishes and his/her pull request sits in code review for a week, then front-end developer 1 technically can't create his/her pull request/code review, because his/her code reviewer can't test, because back-end developer 2's code hasn't been merged into develop yet.

    Bottom line is we're finding ourselves in a much more serial rather than parallel approach in these instances, depending on which route we go, and we would like to find a process that avoids this. The last thing I'll mention is that we realize that by sharing code across branches that hasn't been code reviewed and finalized yet, we are in essence using each other's beta code. To a certain extent I don't think we can avoid that, and we are willing to accept it to a degree. Anyway, any ideas, input, etc. are greatly appreciated. Thanks!

    Read the article

  • Izenda Reports 6.3 Top 10 Features

    - by gt0084e1
    Izenda 6.3 Top 10 New Features and Capabilities:
    1. Izenda Maps Add-On - The Izenda Maps add-on allows rapid visualization of geographic or geospatial data. It is fully integrated with the rest of the Izenda report package and adds a Maps tab which allows users to add interactive maps to their reports. Contact your representative or [email protected] for limited-time discounts. Izenda Maps even has rich drill-down capabilities that allow you to dive deeper with a simple hover (also requires dashboards).
    2. Streamlined Pie Charts with "Other" Slices - The advanced properties of the pie chart now allow you to combine the smaller slices into a single "Other" slice. This reduces the visual complexity without throwing off the scale of the chart. Compare the difference below.
    3. Combined Bar + Line Charts - The bar chart now allows dual visualization of multiple metrics simultaneously by adding a line for secondary data. Enabled via AdHocSettings.AllowLineOnBar = true;
    4. Stacked Bar Charts - The stacked bar chart lets you see a breakdown of a measure based on categorical data. It is enabled with the following code: AdHocSettings.AllowStackedBarChart = true;
    5. Self-Joining Data Sources - The self-join feature allows parent-child relationships to be accessed from the Data Sources tab. The same table can be used as a secondary child table within the Report Designer.
    6. Report Design From Dashboard View - Dashboards now sport both view and design icons to allow quick access to both.
    7. Field Arithmetic on Dates - Differences between dates can now be used as measures with the arithmetic feature.
    8. Simplified Multi-Tenancy - Integrating with multi-tenant systems is now easier than ever. The following APIs have been added to facilitate common scenarios: AdHocSettings.CurrentUserTenantId = value; AdHocSettings.SchedulerTenantID = value; AdHocSettings.SchedulerTenantField = "AccountID";
    9. Support for SQL 2008 R2 and SQL Azure - Izenda now supports the latest version of Microsoft's database as well as the SQL Azure service.
    10. Enhanced Performance and Compatibility for Stored Procedures - Izenda now supports more stored procedures than ever and runs them faster too.

    Read the article

  • Stop Saying "Multi-Channel!"

    - by David Dorf
    I keep hearing the term "multi-channel" in our industry, but it's time to move on. It kinda reminds me of the term "ECR", or electronic cash register. Long ago ECR was a leading-edge term, but nowadays it's rarely used because it's table stakes. After all, what cash register today isn't electronic? The same logic applies to multi-channel, at least when we're talking about tier-1 and tier-2 retailers. If you're still talking about multi-channel retailing, you're in big trouble. Some have switched over to the term "cross-channel," and that's a step in the right direction, but it still falls short. It's kinda like saying, "I upgraded my ECR to accept debit cards!" Yawn. Who hasn't? Today's retailers need to focus on omni-channel, a term I first heard from my friends over at RSR but which was originally coined at IDC. First retailers added e-commerce to their store and catalog channels, yielding multi-channel retailing. Consumers could use the channel that worked best for them. Then some consumers wanted to combine channels with features like buy-on-the-Web, pickup-in-the-store. Thus began the cross-channel initiatives to break down the silos and enable the channels to communicate with each other. But the multi-channel architecture is full of duplication that thwarts efforts to provide a consistent experience. Each channel has its own cart, its own pricing, and often its own CRM. This was an outgrowth of trying to bring the independent channels to market quickly. Rather than reusing and rebuilding existing components to meet the new demands, silos were created that continue to exist today. Today's consumers want omni-channel retailing. They want to interact with brands in a consistent manner that is channel-transparent, yet optimized for that particular interaction. The diagram below, from the soon-to-be-released NRF Mobile Blueprint v2, shows this progression. For retailers to provide an omni-channel experience, there needs to be one logical representation of products, prices, promotions, and customers across all channels. The only thing that varies is the presentation of the content based on the delivery mechanism (e.g. shelf labels, mobile phone, web site, print, etc.), and often these mechanisms can be combined in various ways. I'm looking forward to the day when I can use my phone to scan QR codes in a catalog to create a shopping cart of items, then do some further research on the retailer's web site and be told about related items that might interest me, easily solicit opinions and reviews from social sites, and finally enter the store to pick up my items, knowing that any applicable coupons have been applied. In this scenario I, the consumer, am dealing with a single brand that is aware of me and my needs throughout the entire transaction. Nirvana.

    Read the article

  • Chargeback and showback...both a 'throw back'

    - by llaszews
    Been getting asked again by customers and partners about chargeback and showback in the cloud, so I thought I would blog my response to this question.

    Chargeback background, information and industry analysis: Cloud computing is all about shared resources. These shared resources are computer servers (including memory and CPU), network devices, hard disk storage, database servers, application servers, cooling, floor space, electricity and more. These resources are shared by departments within a company, or by a number of companies when resources are hosted in the public or hybrid cloud. Currently, hosting providers that run other companies on their cloud platforms do not have an accurate way to measure the shared computing resources used by a specific user, let alone by a specific customer. Additionally, companies running their own cloud data centers, for private or hybrid clouds, have no way of measuring and charging back the departments in the company that are using these shared cloud resources. In both cases, the inability to determine shared resource costs and to charge them back to the company, department or user that is using those resources limits any clear measure of business benefit and impacts the company's ability to measure the return on investment (ROI). An IT chargeback system is an accounting strategy that applies the costs of IT services, hardware or software to the business unit in which they are used. This system contrasts with traditional IT accounting models in which a centralized department bears all of the IT costs in an organization and those costs are treated simply as corporate overhead. Showback involves showing the IT costs to a department or customer but not actually charging them for their IT usage. Showback is a gradual method of introducing chargeback into an enterprise; most companies implement a showback mechanism before a full chargeback system is put in place.

    Oracle chargeback product: Oracle Enterprise Manager provides tools for defining detailed chargeback plans spanning the different metrics collected for each type of resource, as well as defining cost centers for grouping costs across multiple developers. Chargeback plans can use not only usage-based costs, but also configuration-based costs (e.g. version of the platform) or fixed costs (e.g. a flat-rate management fee). Chargeback has rich out-of-the-box reports. Trending reports show how charge and resource consumption vary over time, while summary reports show the breakdown of charges or usage by different dimensions such as cost center or target type. These reports help consumers understand how their charges relate to their consumption, and also assist the IT department with budgeting and planning activities. With BI Publisher, the reports can be made available in a variety of formats such as PDF, HTML, Word, Excel or PowerPoint.

    Read the article

  • How to appear very professional on my first freelance project?

    - by iamserious
    I need some help with appearing very professional on my first freelance project. How / what should you do to achieve this? Background: I started as a full-time web developer 18 months ago; two promotions later I am now a senior software engineer. I've never had any problems designing / developing / coding a complicated system. I thought I could use some help for Christmas, so I started bidding for a project, and now I have one - from a very reputable lawyers' association in London. I have no problem dealing with the actual implementation of the system, but I have no idea how to appear professional throughout the whole process. About the project: this lawyers' association is starting distance training courses, and in addition to having a website to show off all their clients etc., they want a students' area where their students would log in, download course materials allocated to them etc., and an admin section where they assign courses to students / create new ones / upload materials etc. Questions breakdown:
    1) How should I start with the requirements gathering? Is using Scrum a good idea, or should I use something like the Volere template - and what should I do with it? Should I submit a copy to the client, etc.?
    2) How often should I meet the client? Would once a fortnight be good?
    3) What are the processes / protocols that I need to follow so that they would be satisfied with me and think that I am very professional?
    4) How much should I charge for the product?
    5) How should I get a quote / contract / receipt for the whole project?
    6) What are the steps that professional freelancers go through during the life cycle of a project?
    My research so far: looking at "How much should I charge" doesn't help. I live in London, zone 1, though I have no idea how much a project of this size would cost; help on this would be appreciated. "How to be professional" articles talk about work / time management etc. and not the actual process - what would real people do etc. - it's like academic theory, but not practical. If this needs revising, please let me know; do not close it for whatever reason, I will edit the question or details to fit the needs. Thanks for reading a lengthy question.

    Read the article

  • How to test email spam scores with amavis?

    - by CaptSaltyJack
    I'd like a way to test a spam message to see the spam score that SpamAssassin gives it. The SA db files (bayes_toks, etc.) reside in /var/lib/amavis/.spamassassin. I've been testing emails by doing this: sudo su amavis -c 'spamassassin -t msgfile' Though this yields some strange results, such as:
        Content analysis details:   (3.7 points, 5.0 required)
         pts rule name              description
        ---- ---------------------- --------------------------------------------------
         3.5 BAYES_99               BODY: Bayes spam probability is 99 to 100%
                                    [score: 1.0000]
        -0.0 NO_RELAYS              Informational: message was not relayed via SMTP
         0.0 LONG_TERM_PRICE        BODY: LONG_TERM_PRICE
         0.2 BAYES_999              BODY: Bayes spam probability is 99.9 to 100%
                                    [score: 1.0000]
        -0.0 NO_RECEIVED             Informational: message has no Received headers
    0.2 is an awfully low score for BAYES_999! But this is the first time I've used amavis; previously I've always just used SpamAssassin directly as a content filter in Postfix, but apparently running amavis/SpamAssassin is more efficient. So, with amavis in the picture, how can I run a test on a message to see its spam score breakdown? Another email I ran a test on got this result:
         2.0 BAYES_80               BODY: Bayes spam probability is 80 to 95%
                                    [score: 0.8487]
    It doesn't make sense that BAYES_80 can yield a higher score than BAYES_999. Help!

    Read the article

< Previous Page | 1 2 3 4 5 6 7  | Next Page >