## Perforce Server Log Analyzers

Log analyzers are useful for quickly identifying common issues on a Perforce server. For more background information, please see:

* http://answers.perforce.com/articles/KB/5470/
* http://answers.perforce.com/articles/KB/2514/

## log_analyzer.php

A PHP script that turns a `P4LOG` into a SQLite database and generates canned reports from it. The reports are useful for identifying common performance issues on a Perforce Helix server, and further queries can be run against the SQLite database it creates. For more information, please refer to the following KB article:

* http://answers.perforce.com/articles/KB/1266/

## p4clog.py

A Python script that scans through a `P4LOG` file quickly and generates data suitable for producing graphs. More information and examples can be found in:

* http://answers.perforce.com/articles/KB/3706/

## log2sql.py

The new kid on the block - similar to log_analyzer.php. Its big advantage is a fairly comprehensive test harness, and it processes all manner of recently observed log file scenarios.

In addition, there is an associated Python Flask-based web app which runs various pre-canned queries and charts. It is set up to run inside a Docker container for ease of isolating the environment.

Download the zip file (or clone from the Public depot):

* https://swarm.workshop.perforce.com/projects/perforce-software-log-analyzer/archives/main.zip

Unzip it into a directory, say `$HOME/loganalyzer`.

It is easiest if you have Docker installed on your host system. The alternative is to run the app using Python (and Flask), in which case you need to install the dependencies:

    cd $HOME/loganalyzer/psla
    pip install -r requirements.txt

Please note that pip can be installed as a package (e.g. python-pip), or directly from https://pip.pypa.io/en/stable/installing/

To run:

    export FLASK_APP=psla.py        # Absolute path also possible
    export FLASK_DEBUG=1
    export PSLA_LOGS=`pwd`/logs     # Absolute path required
    flask run --host=0.0.0.0 --port=5000

Obviously you can change the port if desired.

With Docker, use the Makefile (the first build from scratch will take 5-10 mins):

    make build

To run the container:

    make up

The latter runs the web app on http://localhost:5000

The container can also be run manually:

    docker run --rm -p=5000:5000 --name=psla -v `pwd`/psla/logs:/logs psla

Note that there is a page to upload a log file into the psla/logs directory inside the container. However, the app/container is run with the psla/logs directory under the current directory mounted as /logs inside the container, so files will persist when the container is not running. Any generated database will also be put in this same directory.

Especially if you have large log files, it is better to run the analysis manually first and put the results in the logs directory, then run the container to analyse the results. So:

    cp p4d.log psla/logs
    cd psla/logs
    ../log2sql.py -d dbname p4d.log

View the progress information, and check that the resulting dbname.db is created. Example:

    $ ../log2sql.py -d dbname p4d.log
    2018-03-28 11:01:47,083:INFO Processing p4d.log:
    2018-03-28 11:01:47,137:INFO ...0%
    2018-03-28 11:01:49,344:INFO ...10%
    2018-03-28 11:01:51,096:INFO ...20%
    2018-03-28 11:01:52,963:INFO ...30%
    2018-03-28 11:01:55,561:INFO ...40%
    2018-03-28 11:01:58,133:INFO ...50%
    2018-03-28 11:02:00,594:INFO ...60%
    2018-03-28 11:02:02,742:INFO ...70%
    2018-03-28 11:02:04,194:INFO ...80%
    2018-03-28 11:02:06,144:INFO ...90%
    2018-03-28 11:02:07,962:INFO ...100%

Then run the Docker container or web app as described above.
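If you want to sanity-check the generated database before analysing it in the web app, the standard `sqlite3` command-line tool can list the tables it contains. This is just an optional quick check; `dbname.db` is whatever filename resulted from the `-d` option above.

    # Confirm the database was created and list the tables it contains
    # (dbname.db is the file produced by the -d option above).
    sqlite3 dbname.db ".tables"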
Navigate to http://localhost:5000/analyseLog, select your database name and click Analyse.
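The web app is not the only way to explore the results: since the output is an ordinary SQLite database, you can also run ad hoc SQL against it. The sketch below shows one possible query for the ten longest-running commands; the `process` table and the `cmd`/`user`/`completedLapse` column names are assumptions based on typical log2sql output, so check the actual schema in your database first.

    # Ad hoc query: ten slowest commands by completed lapse time.
    # NOTE: the "process" table and its column names are assumptions -
    # run `sqlite3 dbname.db ".schema"` to confirm them for your version.
    sqlite3 -header -column dbname.db "
      SELECT cmd, user, completedLapse
      FROM process
      ORDER BY completedLapse DESC
      LIMIT 10;"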