coalib.core package

Submodules

coalib.core.Bear module

class coalib.core.Bear.Bear(section: coalib.settings.Section.Section, file_dict: dict)[source]

Bases: object

A bear contains the actual subroutine that is responsible for checking source code for certain specifications. However, it can actually do whatever it wants with the files it gets.

This is the base class for every bear. If you want to write a bear, you will probably want to look at the ProjectBear and FileBear classes that inherit from this class.

To indicate which languages your bear supports, just give it the LANGUAGES value, which should be a set of strings:

>>> class SomeBear(Bear):
...     LANGUAGES = {'C', 'CPP', 'C#', 'D'}

To indicate the requirements of the bear, assign REQUIREMENTS a set of PackageRequirement instances:

>>> from dependency_management.requirements.PackageRequirement import (
...     PackageRequirement)
>>> class SomeBear(Bear):
...     REQUIREMENTS = {
...         PackageRequirement('pip', 'coala_decorators', '0.2.1')}

If your bear uses requirements from a package manager for which a dedicated subclass exists, such as PipRequirement, you can use that subclass without specifying the manager:

>>> from dependency_management.requirements.PipRequirement import (
...     PipRequirement)
>>> class SomeBear(Bear):
...     REQUIREMENTS = {PipRequirement('coala_decorators', '0.2.1')}

To specify additional attributes for your bear, use the following:

>>> class SomeBear(Bear):
...     AUTHORS = {'Jon Snow'}
...     AUTHORS_EMAILS = {'[email protected]'}
...     MAINTAINERS = {'Catelyn Stark'}
...     MAINTAINERS_EMAILS = {'[email protected]'}
...     LICENSE = 'AGPL-3.0'
...     ASCIINEMA_URL = 'https://asciinema.org/a/80761'

If the maintainers are the same as the authors, they can be omitted:

>>> class SomeBear(Bear):
...     AUTHORS = {'Jon Snow'}
...     AUTHORS_EMAILS = {'[email protected]'}
>>> SomeBear.maintainers
{'Jon Snow'}
>>> SomeBear.maintainers_emails
{'[email protected]'}

If your bear needs to include local files, specify them by adding strings with their relative file paths to the INCLUDE_LOCAL_FILES set:

>>> class SomeBear(Bear):
...     INCLUDE_LOCAL_FILES = {'checkstyle.jar', 'google_checks.xml'}

To make it easier to keep track of what a bear can do, list its capabilities in the CAN_DETECT and CAN_FIX sets. Possible values are:

>>> CAN_DETECT = {'Syntax', 'Formatting', 'Security', 'Complexity',
... 'Smell', 'Unused Code', 'Redundancy', 'Variable Misuse', 'Spelling',
... 'Memory Leak', 'Documentation', 'Duplication', 'Commented Code',
... 'Grammar', 'Missing Import', 'Unreachable Code', 'Undefined Element',
... 'Code Simplification'}
>>> CAN_FIX = {'Syntax', ...}

Specifying something in CAN_FIX implies that it can be detected too, so it may be omitted from CAN_DETECT:

>>> class SomeBear(Bear):
...     CAN_DETECT = {'Syntax', 'Security'}
...     CAN_FIX = {'Redundancy'}
>>> sorted(SomeBear.can_detect)
['Redundancy', 'Security', 'Syntax']

Every bear has a data directory which is unique to that particular bear:

>>> class SomeBear(Bear): pass
>>> class SomeOtherBear(Bear): pass
>>> SomeBear.data_dir == SomeOtherBear.data_dir
False

A bear can be dependent on other bears. BEAR_DEPS contains bear classes that are to be executed before this bear gets executed. The results of these bears will then be passed inside self.dependency_results as a dict. The dict will have the bear type as key and a list of its results as value:

>>> class SomeBear(Bear): pass
>>> class SomeOtherBear(Bear):
...     BEAR_DEPS = {SomeBear}
>>> SomeOtherBear.BEAR_DEPS
{<class 'coalib.core.Bear.SomeBear'>}
ASCIINEMA_URL = ''
AUTHORS = set()
AUTHORS_EMAILS = set()
BEAR_DEPS = set()
CAN_DETECT = set()
CAN_FIX = set()
INCLUDE_LOCAL_FILES = set()
LANGUAGES = set()
LICENSE = ''
MAINTAINERS = set()
MAINTAINERS_EMAILS = set()
PLATFORMS = {'any'}
REQUIREMENTS = set()
analyze(*args, **kwargs)[source]

Performs the code analysis.
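
For illustration, a minimal sketch of a bear overriding analyze. The parameter names here are hypothetical; the actual arguments are whatever generate_tasks (see below) provides for each task:

>>> class LineCountBear(Bear):
...     def analyze(self, filename, file):
...         yield '{} has {} lines'.format(filename, len(file))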

Returns: An iterable of results.
can_detect = set()
classmethod check_prerequisites()[source]

Checks whether needed runtime prerequisites of the bear are satisfied.

This function gets executed at construction.

Section value requirements shall be checked inside the run method.

>>> from dependency_management.requirements.PipRequirement import (
...     PipRequirement)
>>> class SomeBear(Bear):
...     REQUIREMENTS = {PipRequirement('pip')}
>>> SomeBear.check_prerequisites()
True
>>> class SomeOtherBear(Bear):
...     REQUIREMENTS = {PipRequirement('really_bad_package')}
>>> SomeOtherBear.check_prerequisites()
'Following requirements are not installed: really_bad_package (...)'
Returns: True if prerequisites are satisfied, else False or a string giving a more detailed description of what's missing.
data_dir = '/home/docs/.local/share/coala-bears/Bear'
dependency_results

Contains all dependency results.

This variable gets set during bear execution from the core and can be used from analyze.

Modifications to the returned dictionary while the core is running lead to undefined behaviour.

>>> section = Section('my-section')
>>> file_dict = {'file1.txt': ['']}
>>> bear = Bear(section, file_dict)
>>> bear.dependency_results
defaultdict(<class 'list'>, {})
>>> dependency_bear = Bear(section, file_dict)
>>> bear.dependency_results[type(dependency_bear)] += [1, 2]
>>> bear.dependency_results
defaultdict(<class 'list'>, {<class 'coalib.core.Bear.Bear'>: [1, 2]})
Returns: A dictionary with bear types as keys and the results received from them as values.
classmethod download_cached_file(url, filename)[source]

Downloads the file if needed and caches it for the next time. If a download happens, the user will be informed.

Take a sane simple bear:

>>> section = Section('my-section')
>>> file_dict = {'file1.txt': ['']}
>>> bear = Bear(section, file_dict)

We can now carelessly query for a neat file that doesn’t exist yet:

>>> from os import remove
>>> from os.path import exists, join
>>> if exists(join(bear.data_dir, 'a_file')):
...     remove(join(bear.data_dir, 'a_file'))
>>> file = bear.download_cached_file('https://github.com/', 'a_file')

If we download it again, it’ll be much faster as no download occurs:

>>> newfile = bear.download_cached_file(
...     'https://github.com/', 'a_file')
>>> newfile == file
True
Parameters:
  • url – The URL to download the file from.
  • filename – The filename it should get, e.g. “test.txt”.
Returns:

A full path to the file ready for you to use!

execute_task(args, kwargs)[source]

Executes a task.

By default returns a list of results collected from this bear.

This function has to return something that is picklable to make bears work in multi-process environments.
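
As a hedged sketch of the default behaviour described above (assuming the task arguments are forwarded to analyze and its results collected into a list; Section and the file dict follow the examples above):

>>> class DoubleBear(Bear):
...     def analyze(self, value):
...         yield value * 2
>>> bear = DoubleBear(Section('my-section'), {'file1.txt': ['']})
>>> bear.execute_task((21,), {})
[42]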

Parameters:
  • args – The arguments of a task.
  • kwargs – The keyword-arguments of a task.
Returns:

A list of results from the bear.

generate_tasks()[source]

This method is responsible for providing the job arguments analyze gets called with.
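
As a hypothetical sketch, a bear could schedule one task per file, assuming the file_dict passed to the constructor is stored as self.file_dict:

>>> class PerFileBear(Bear):
...     def analyze(self, filename, file):
...         yield filename
...     def generate_tasks(self):
...         return (((filename, file), {})
...                 for filename, file in self.file_dict.items())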

Returns: An iterable containing the positional and keyword arguments organized in pairs: (args-tuple, kwargs-dict).
get_config_dir()[source]

Gives the directory where the configuration file resides.

Returns: Directory of the config file.
classmethod get_metadata()[source]
Returns: Metadata for the analyze function extracted from its signature. Excludes parameter self.
classmethod get_non_optional_settings()[source]

This method has to determine which settings are needed by this bear. The user will be prompted for needed settings that are not available in the settings file, so don't include settings where a default value would do.

Note: This function also queries settings from bear dependencies recursively. Though circular dependency chains are hard to achieve, this function would never return on them!

Returns: A dictionary with the needed settings as keys and tuples of help text and annotation as values.
maintainers = set()
maintainers_emails = set()
name = 'Bear'
new_result

Returns a partial for creating a result with this bear already bound.

static setup_dependencies()[source]

This is a user-defined function that can download and set up dependencies (via download_cached_file or arbitrary other means) in an OS-independent way.
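
As a hypothetical sketch, a bear might fetch a file it needs here via download_cached_file (the URL and filename below are made up for illustration):

>>> class SomeBear(Bear):
...     @staticmethod
...     def setup_dependencies():
...         # Hypothetical URL and filename, for illustration only.
...         SomeBear.download_cached_file(
...             'https://example.com/checkstyle.jar', 'checkstyle.jar')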

source_location = '/home/docs/checkouts/readthedocs.org/user_builds/coala-api/checkouts/latest/coalib/core/Bear.py'

coalib.core.Core module

coalib.core.Core.cleanup_bear(bear, running_tasks, event_loop)[source]

Cleans up state of an ongoing run for a bear.

  • If the given bear has no running tasks left, it removes the bear from the running_tasks dict.
  • Checks whether there are any remaining tasks, and quits the event loop accordingly if none are left.
Parameters:
  • bear – The bear to clean up state for.
  • running_tasks – The dict of running-tasks.
  • event_loop – The event-loop tasks are scheduled on.
coalib.core.Core.finish_task(bear, result_callback, running_tasks, event_loop, executor, task)[source]

The callback for when a task of a bear completes. It is responsible for checking whether the bear has completed its execution and for handling the result generated by the task.

Parameters:
  • bear – The bear that the task belongs to.
  • result_callback – A callback function which is called when results are available.
  • running_tasks – Dictionary that keeps track of the remaining tasks of each bear.
  • event_loop – The asyncio event loop bear-tasks are scheduled on.
  • executor – The executor to which the bear tasks are scheduled.
  • task – The task that completed.
coalib.core.Core.run(bears, result_callback)[source]

Runs a coala session.
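
As a hedged usage sketch, a session that collects all results into a list could be set up like this (some_bear stands for any constructed bear instance; the actual call is shown commented out):

>>> from coalib.core.Core import run
>>> results = []
>>> def result_callback(result):
...     results.append(result)
>>> # run({some_bear}, result_callback)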

Parameters:
  • bears – The bear instances to run.
  • result_callback

    A callback function which is called when results are available. It must have the following signature:

    def result_callback(result):
        pass
    
coalib.core.Core.schedule_bears(bears, result_callback, event_loop, running_tasks, executor)[source]

Schedules the tasks of bears to the given executor and runs them on the given event loop.

Parameters:
  • bears – A list of bear instances to be scheduled onto the process pool.
  • result_callback – A callback function which is called when results are available.
  • event_loop – The asyncio event loop to schedule bear tasks on.
  • running_tasks – Tasks that are already scheduled, organized in a dict with bear instances as keys and asyncio-coroutines as values containing their scheduled tasks.
  • executor – The executor to which the bear tasks are scheduled.

Module contents