run 2115 TEST commit e544c47
ci-bot edited this page 2025-08-12 10:04:19 +00:00

🧪 Test Report

View CI Run 2115 | Commit e544c47

Generated on 2025-08-12 12:03:56 CEST

🧾 General Info

  • duration: 6.398661851882935 s
  • root: /workspace/tligui_y/slic
  • environment: {}

📋 Summary

  • Failed: 6
  • Total: 6
  • Collected: 6

🔎 Tests

Failed (6)
  • 📄 test_utils_elog.py

    Function: test_post_local

    • Test 1

      📌 Setup phase
      • duration: 0.00038533564656972885 s
      • outcome: passed

      📌 Call phase
      • duration: 0.014231122098863125 s
      • outcome: failed

      crash:

      path: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
      lineno: 601
      message: elog.logbook_exceptions.LogbookServerProblem: No response from the logbook server.
      Details: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      

      traceback:

      -   path: tests/test_utils_elog.py
        lineno: 45
        message: None
      -   path: .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
        lineno: 307
        message: in post
      -   path: .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
        lineno: 601
        message: LogbookServerProblem
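
      The crash and traceback above show the POST failing because nothing is listening on localhost:8080. A minimal, hypothetical reachability check (not part of the test suite; host and port are taken from the crash message above) could be used to skip such tests when no local elog server is running:

      ```python
      import socket


      def logbook_reachable(host="localhost", port=8080, timeout=1.0):
          """Return True if a TCP connection to the logbook server succeeds.

          Defaults match the address from the crash above; adjust as needed.
          """
          try:
              with socket.create_connection((host, port), timeout=timeout):
                  return True
          except OSError:
              # Covers ConnectionRefusedError ([Errno 111]) as well as timeouts.
              return False
      ```

      A test module could then guard network-dependent tests with, e.g., `@pytest.mark.skipif(not logbook_reachable(), reason="no local elog server on localhost:8080")`, so they are reported as skipped instead of failed when the server is absent.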
      

      longrepr:

      self = <urllib3.connection.HTTPConnection object at 0x7fbde23baaf0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde2772af0>
      method = 'POST', url = '/demo/'
      body = b'--7a2706534276b5226635aaddb1126a25\r\nContent-Disposition: form-data; name="Author"\r\n\r\nrobot\r\n--7a2706534276b5...position: form-data; name="Text"; filename=""\r\n\r\nHello from local test\r\n--7a2706534276b5226635aaddb1126a25--\r\n'
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '1027', 'Content-Type': 'multipart/form-data; boundary=7a2706534276b5226635aaddb1126a25'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo/', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde23baaf0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde23baaf0>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde23ba670>
      request = <PreparedRequest [POST]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'POST', url = '/demo/', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde23baaf0>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde2772af0>
      _stacktrace = <traceback object at 0x7fbde27c6700>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde23baaf0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
      self = <elog.logbook.Logbook object at 0x7fbde23ba100>
      message = 'Hello from local test', msg_id = None, reply = False
      attributes = {'Author': 'robot', 'Category': 'General', 'Encoding': 'HTML', 'Subject': 'Test simple', ...}
      attachments = [], suppress_email_notification = False, encoding = 'HTML'
      timeout = None, kwargs = {}
      new_attachment_list = [('Text', ('', b'Hello from local test'))]
      objects_to_close = []
      attributes_to_edit = {'Author': b'robot', 'Category': b'General', 'Encoding': b'HTML', 'Subject': b'Test simple', ...}
      
          def post(self, message, msg_id=None, reply=False, attributes=None, attachments=None,
                   suppress_email_notification=False, encoding=None, timeout=None, **kwargs):
              """
              Posts message to the logbook. If msg_id is not specified new message will be created, otherwise existing
              message will be edited, or a reply (if reply=True) to it will be created. This method returns the msg_id
              of the newly created message.
      
              :param message: string with message text
              :param msg_id: ID number of message to edit or reply. If not specified new message is created.
              :param reply: If 'True' reply to existing message is created instead of editing it
              :param attributes: Dictionary of attributes. Following attributes are used internally by the elog and will be
                                 ignored: Text, Date, Encoding, Reply to, In reply to, Locked by, Attachment
              :param attachments: list of:
                                        - file like objects which read() will return bytes (if file_like_object.name is not
                                          defined, default name "attachment<i>" will be used.
                                        - paths to the files
                                  All items will be appended as attachment to the elog entry. In case of unknown
                                  attachment an exception LogbookInvalidAttachment will be raised.
              :param suppress_email_notification: If set to True or 1, E-Mail notification will be suppressed, defaults to False.
              :param encoding: Defines encoding of the message. Can be: 'plain' -> plain text, 'html'->html-text,
                               'ELCode' --> elog formatting syntax
              :param timeout: Define the timeout to be used by the post request. Its value is directly passed to the requests
                              post. Use None to disable the request timeout.
              :param kwargs: Anything in the kwargs will be interpreted as attribute. e.g.: logbook.post('Test text',
                             Author='Rok Vintar), "Author" will be sent as an attribute. If named same as one of the
                             attributes defined in "attributes", kwargs will have priority.
      
              :return: msg_id
              """
      
              attributes = attributes or {}
              attributes = {**attributes, **kwargs}  # kwargs as attributes with higher priority
      
              attachments = attachments or []
      
              if encoding is not None:
                  if encoding not in ['plain', 'HTML', 'ELCode']:
                      raise LogbookMessageRejected('Invalid message encoding. Valid options: plain, HTML, ELCode.')
                  attributes['Encoding'] = encoding
      
              if suppress_email_notification:
                  attributes["suppress"] = 1
      
              # THE ATTACHMENT STRATEGY WHEN DEALING WITH POST MODIFICATION
              #
              # 1. Does the message on the server have already attachments?
              #    1.1 - We read the message getting the existing attachment list.
              #    1.2 - Add to the attributes dictionary one line for each attachment like this:
              #       attributes['attachmentN'] = timestamped_filename_name
              #
              # 2. Do we have new attachments?
              #    2.1 - Those are in the new_attachment_list. This is a list of this type:
              #       [ ('attfileN', ('filename', fileobject)) ]
              #    2.2 - We need to loop over all the new attachments:
              #       2.2.1 - Does a file already on the server with the same name exist?
              #         2.2.1.1 - No: OK. Then we go ahead with the next attachment.
              #         2.2.1.2 - Yes:
              #           2.2.1.2.1 - Are the two files identical?
               #               2.2.1.2.1.1 - Yes: then we remove this entry from the new_attachment_list and keep the one
               #                      already on the server.
              #               2.2.1.2.1.2 - No:
               #                  2.2.1.2.1.2.1 - Then the file has been updated.
              #                  2.2.1.2.1.2.2 - We need to remove the file on server first (using special post)
              #                  2.2.1.2.1.2.3 - We have to remove the old attachment from the attributes dictionary.
              #
      
              if attachments:
                  # here we accomplish point 2.1.
                  # new_attachment_list is something like [ ('attfileN', ('filename', fileobject)) ]
                  new_attachment_list, objects_to_close = self._prepare_attachments(attachments)
              else:
                  objects_to_close = list()
                  new_attachment_list = list()
      
              attributes_to_edit = dict()
              if msg_id:
                  # Message exists, we can continue
                  if reply:
                      # Verify that there is a message on the server, otherwise do not reply to it!
                       self._check_if_message_on_server(msg_id)  # raises an exception in case of a non-existing message
                      attributes['reply_to'] = str(msg_id)
                  else:  # Edit existing
                      attributes['edit_id'] = str(msg_id)
                      attributes['skiplock'] = '1'
      
                      # here we accomplish point 1.1.
                      # existing_attachments_list is something like:
                      # [ 'https://elog.url.com/logbook/timestamped_filename' ]
                      msg_to_edit, attributes_to_edit, existing_attachments_list = self.read(msg_id)
      
                      for attribute, data in attributes.items():
                          new_data = attributes.get(attribute)
                          if new_data is not None:
                              attributes_to_edit[attribute] = new_data
      
                      i = 0
                      existing_attachments_filename_list = list()
                      for attachment in existing_attachments_list:
                          # here we accomplish point 1.2. We strip the timestamped_filename from the whole URL.
                          attributes_to_edit[f'attachment{i}'] = os.path.basename(attachment)
                          existing_attachments_filename_list.append(os.path.basename(attachment)[14:])
                          i += 1
      
                      # let's accomplish 2.2. Loop over all new attachment
                      duplicate_attachment_list = list()
                      for new_attachment in new_attachment_list:
                          # the new_attachment_list is something like:
                          # [ ('attfileN', ('filename', fileobject)) ]
                          new_attachment_filename = new_attachment[1][0]
                          if new_attachment_filename in existing_attachments_filename_list:
                           # a file with the same name already exists on the server.
                              # we need to check if the two files are the same.
                              # read the content of the new file
                              new_attachment_content = new_attachment[1][1].read()
                              # don't forget to reset the fileobj to the beginning of the file
                              new_attachment[1][1].seek(0)
                              # get the existing attachment content
                              attachment_index = existing_attachments_filename_list.index(new_attachment_filename)
                              existing_attachment_content = self.download_attachment(
                                  url=existing_attachments_list[attachment_index],
                                  timeout=timeout
                              )
                              # check if the two contents are the same
                              if new_attachment_content == existing_attachment_content:
                                  # yes. then we don't upload a second copy. we remove the current entry from the list
                                  duplicate_attachment_list.append(new_attachment)
                              else:
                                  # no. they are not the same file. we will replace the existing file with the new one
                                  # first: we need to remove the attachment from the server using the dedicated method
                                  self.delete_attachment(msg_id, attributes=attributes_to_edit,
                                                         attachment_id=attachment_index,
                                                         timeout=timeout, text=msg_to_edit)
                                  # now we can remove this attachment from the auxiliary lists.
                                  existing_attachments_filename_list.pop(attachment_index)
                                  existing_attachments_list.pop(attachment_index)
                                  # now we need to rebuild the attributes dictionary for the part concerning the attachments.
                                  # we remove all of them first
                                  keys_to_be_removed = list()
                                  for key in attributes_to_edit.keys():
                                      if key.startswith('attachment'):
                                          keys_to_be_removed.append(key)
                                      if key.startswith('delatt'):
                                          keys_to_be_removed.append(key)
                                  for key in keys_to_be_removed:
                                      del attributes_to_edit[key]
      
                                  # now we rebuild it
                                  for i, attachment in enumerate(existing_attachments_list):
                                      attributes_to_edit[f'attachment{i}'] = os.path.basename(attachment)
      
                      # remove all duplicate attachments from the new_attachment_list
                      for attach in duplicate_attachment_list:
                          new_attachment_list.remove(attach)
      
              else:
                  # As we create a new message, specify creation time if not already specified in attributes
                  if 'When' not in attributes:
                      attributes['When'] = int(datetime.now().timestamp())
      
              if not attributes_to_edit:
                  attributes_to_edit = attributes
      
              # Remove any attributes that should not be sent
              _remove_reserved_attributes(attributes_to_edit)
      
              # Make requests module think that Text is a "file". This is the only way to force requests to send data as
              # multipart/form-data even if there are no attachments. Elog understands only multipart/form-data
              new_attachment_list.append(('Text', ('', message.encode('iso-8859-1'))))
      
              # Base attributes are common to all messages
              self._add_base_msg_attributes(attributes_to_edit)
      
              # Keys in attributes cannot have certain characters like whitespaces or dashes for the http request
              attributes_to_edit = _replace_special_characters_in_attribute_keys(attributes_to_edit)
      
              # All string values in the attributes must be encoded in latin1
              attributes_to_edit = _encode_values(attributes_to_edit)
      
              try:
      >           response = requests.post(self._url, data=attributes_to_edit, files=new_attachment_list,
                                           allow_redirects=False, verify=False, timeout=timeout)
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:288: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:115: in post
          return request("post", url, data=data, json=json, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde23ba670>
      request = <PreparedRequest [POST]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde23baaf0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      
      During handling of the above exception, another exception occurred:
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbddd127910>
      method = 'GET', url = '/demo/None', body = None
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Cookie': 'unm=robot;upwd=me1T.2jUUqQNa1wNuey9zNBOmOa4eILOaPb.ZSZjpn4;'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo/None', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde23bac40>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'GET', url = '/demo/None', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbddd127910>
      _stacktrace = <traceback object at 0x7fbde27b7500>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
      self = <elog.logbook.Logbook object at 0x7fbde23ba100>, msg_id = None
      timeout = None
      
          def _check_if_message_on_server(self, msg_id, timeout=None):
              """Try to load page for specific message. If there is a html tag like <td class="errormsg"> then there is no
              such message.
      
              :param msg_id: ID of message to be checked
              :params timeout: The value of timeout to be passed to the get request
              :return:
              """
      
              request_headers = dict()
              if self._user or self._password:
                  request_headers['Cookie'] = self._make_user_and_pswd_cookie()
              try:
      >           response = requests.get(self._url + str(msg_id), headers=request_headers, allow_redirects=False,
                                          verify=False, timeout=timeout)
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:581: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:73: in get
          return request("get", url, params=params, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde23bac40>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      
      During handling of the above exception, another exception occurred:
      
          def test_post_local():
              logbook = elog.open(
                  hostname="http://localhost",
                  port=8080,
                  user="robot",
                  password="testpassword",
                  use_ssl=False,
                  logbook="demo"
              )
              attributes = {
                  "Author": "robot",
                  "Subject": "Test simple",
                  "Category": "General",
              }
              message = "Hello from local test"
      >       msg_id = logbook.post(message, attributes=attributes, encoding="HTML")
      
      tests/test_utils_elog.py:45: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:307: in post
          self._check_if_message_on_server(msg_id)  # raises exceptions if no message or no response from server
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <elog.logbook.Logbook object at 0x7fbde23ba100>, msg_id = None
      timeout = None
      
          def _check_if_message_on_server(self, msg_id, timeout=None):
              """Try to load page for specific message. If there is a html tag like <td class="errormsg"> then there is no
              such message.
      
              :param msg_id: ID of message to be checked
              :params timeout: The value of timeout to be passed to the get request
              :return:
              """
      
              request_headers = dict()
              if self._user or self._password:
                  request_headers['Cookie'] = self._make_user_and_pswd_cookie()
              try:
                  response = requests.get(self._url + str(msg_id), headers=request_headers, allow_redirects=False,
                                          verify=False, timeout=timeout)
      
                  # If there is no message code 200 will be returned (OK) and _validate_response will not recognise it
                  # but there will be some error in the html code.
                  resp_message, resp_headers, resp_msg_id = _validate_response(response)
                  # If there is no message, code 200 will be returned (OK) but there will be some error indication in
                  # the html code.
                  if re.findall('<td.*?class="errormsg".*?>.*?</td>',
                                resp_message.decode('utf-8', 'ignore'),
                                flags=re.DOTALL):
                      raise LogbookInvalidMessageID('Message with ID: ' + str(msg_id) + ' does not exist on logbook.')
      
              except requests.Timeout as e:
                  # Catch here a timeout o the post request.
                  # Raise the logbook exception and let the user handle it
                  raise LogbookServerTimeout('{0} method cannot be completed because of a network timeout:\n' +
                                             '{1}'.format(sys._getframe().f_code.co_name, e))
      
              except requests.RequestException as e:
      >           raise LogbookServerProblem('No response from the logbook server.\nDetails: ' + '{0}'.format(e))
      E           elog.logbook_exceptions.LogbookServerProblem: No response from the logbook server.
      E           Details: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde27cd3d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:601: LogbookServerProblem
      

      📌 Teardown phase

      duration:

      0.0003813537769019604
      

      outcome:

      passed
      

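Note: all six failures share one root cause — nothing is listening on `localhost:8080`, so every request from the elog client ends in `[Errno 111] Connection refused`. A minimal sketch of a reachability guard that a suite like this could use to skip, rather than fail, when the demo logbook server is down. The helper name `port_open` and the skip condition are illustrative assumptions, not part of `tests/test_utils_elog.py`:

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) can be established."""
    try:
        # create_connection raises OSError (e.g. ConnectionRefusedError,
        # socket.timeout, gaierror) when the endpoint is unreachable.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Hypothetical usage: gate the whole test module on server availability, e.g.
#   pytestmark = pytest.mark.skipif(
#       not port_open("localhost", 8080),
#       reason="no elog server listening on localhost:8080",
#   )
```

With such a guard, these tests would report as skipped in environments (like this CI runner) where the demo elog server is not started, instead of producing six identical `ConnectionError` tracebacks.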
    Function: test_get_default_elog_instance_with_direct_password_and_real_check

    • Test 2

      📌 Setup phase

      duration:

      0.00015097670257091522
      

      outcome:

      passed
      

      📌 Call phase

      duration:

      0.011280029080808163
      

      outcome:

      failed
      

      crash:

      path: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py
      lineno: 700
      message: requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2353b80>: Failed to establish a new connection: [Errno 111] Connection refused'))
      

      traceback:

      -   path: tests/test_utils_elog.py
        lineno: 63
        message: None
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/api.py
        lineno: 73
        message: in get
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/api.py
        lineno: 59
        message: in request
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py
        lineno: 589
        message: in request
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py
        lineno: 703
        message: in send
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py
        lineno: 700
        message: ConnectionError
      

      longrepr:

      self = <urllib3.connection.HTTPConnection object at 0x7fbde2353b80>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbdcf38f730>
      method = 'GET', url = '/demo', body = None
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cm9ib3Q6dGVzdHBhc3N3b3Jk'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde2353b80>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde2353b80>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde1848130>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = True
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'GET', url = '/demo', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2353b80>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbdcf38f730>
      _stacktrace = <traceback object at 0x7fbde18b2e80>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2353b80>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
          def test_get_default_elog_instance_with_direct_password_and_real_check():
              url = "http://localhost:8080/demo"
              user = "robot"
              password = "testpassword"
      
              elog_instance, returned_user = get_default_elog_instance(url, user=user, password=password)
      
              assert returned_user == user
              assert hasattr(elog_instance, "post")
      
      >       r = requests.get(url, auth=(user, password))
      
      tests/test_utils_elog.py:63: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:73: in get
          return request("get", url, params=params, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde1848130>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = True
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2353b80>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      

      📌 Teardown phase

      duration:

      0.0002748859114944935
      

      outcome:

      passed
      

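Editor's note: every failure above shares one root cause — the tests perform real HTTP requests against a logbook server at `http://localhost:8080/demo`, which is not running in the CI environment (`[Errno 111] Connection refused`). A conventional mitigation, sketched below under assumptions, is to probe the server and skip these integration tests when it is unreachable. The helper name `server_reachable` and the `pytest.mark.skipif` guard are illustrative suggestions, not code from this repository.

```python
import socket


def server_reachable(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to (host, port) succeeds.

    This is the same connection step that fails in the tracebacks above
    (urllib3's create_connection -> ConnectionRefusedError), reduced to a
    cheap yes/no probe.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers ConnectionRefusedError, timeouts, and DNS failures.
        return False


# Hypothetical guard for tests/test_utils_elog.py: skip the whole module
# when the demo logbook server is not up, instead of failing with
# ConnectionError / LogbookServerProblem.
#
#   import pytest
#   pytestmark = pytest.mark.skipif(
#       not server_reachable("localhost", 8080),
#       reason="elog demo server not running on localhost:8080",
#   )
```

Alternatively, the HTTP layer could be stubbed (e.g. with `responses` or `unittest.mock`) so these tests run without a live server; which approach fits depends on whether they are meant as unit or integration tests.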
    Function: test_get_default_elog_instance_asks_password_and_opens

    • Test 3

      📌 Setup phase

      duration:

      0.00014234893023967743
      

      outcome:

      passed
      

      📌 Call phase

      duration:

      0.010521035175770521
      

      outcome:

      failed
      

      crash:

      path: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py
      lineno: 700
      message: requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde26bbbe0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      

      traceback:

      -   path: tests/test_utils_elog.py
        lineno: 80
        message: None
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/api.py
        lineno: 73
        message: in get
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/api.py
        lineno: 59
        message: in request
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py
        lineno: 589
        message: in request
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py
        lineno: 703
        message: in send
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py
        lineno: 700
        message: ConnectionError
      

      longrepr:

      self = <urllib3.connection.HTTPConnection object at 0x7fbde26bbbe0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde29144f0>
      method = 'GET', url = '/demo', body = None
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cm9ib3Q6dGVzdHBhc3N3b3Jk'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde26bbbe0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde26bbbe0>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde29141f0>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = True
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'GET', url = '/demo', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde26bbbe0>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde29144f0>
      _stacktrace = <traceback object at 0x7fbde20977c0>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde26bbbe0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
      mock_home = <MagicMock name='home' id='140453347989968'>
      mock_getpass = <MagicMock name='getpass' id='140453521379824'>
      
          @patch("slic.utils.elog.getpass")
          @patch("slic.utils.elog.Path.home")
          def test_get_default_elog_instance_asks_password_and_opens(mock_home, mock_getpass):
        mock_home.return_value = Path("/does/not/exist")  # Fake home → reading the credentials file fails
              mock_getpass.return_value = "testpassword"
      
              url = "http://localhost:8080/demo"
              user = "robot"
      
              elog_instance, returned_user = get_default_elog_instance(url, user=user)
      
              assert returned_user == user
              assert hasattr(elog_instance, "post")
      
      >       r = requests.get(url, auth=(user, mock_getpass.return_value))
      
      tests/test_utils_elog.py:80: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:73: in get
          return request("get", url, params=params, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde29141f0>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = True
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde26bbbe0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      

      📌 Teardown phase

      duration:

      0.00026375195011496544
      

      outcome:

      passed
      

    Function: test_get_default_elog_with_path_home

    • Test 4

      📌 Setup phase

      duration:

      0.00015332410112023354
      

      outcome:

      passed
      

      📌 Call phase

      duration:

      0.006974678952246904
      

      outcome:

      failed
      

      crash:

      path: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py
      lineno: 700
      message: requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2961730>: Failed to establish a new connection: [Errno 111] Connection refused'))
      

      traceback:

      -   path: tests/test_utils_elog.py
        lineno: 107
        message: None
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/api.py
        lineno: 73
        message: in get
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/api.py
        lineno: 59
        message: in request
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py
        lineno: 589
        message: in request
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py
        lineno: 703
        message: in send
      -   path: .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py
        lineno: 700
        message: ConnectionError
      

      longrepr:

      self = <urllib3.connection.HTTPConnection object at 0x7fbde2961730>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde22bb370>
      method = 'GET', url = '/demo', body = None
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Authorization': 'Basic cm9ib3Q6dGVzdHBhc3N3b3Jk'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde2961730>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde2961730>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde22bb640>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = True
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'GET', url = '/demo', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2961730>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde22bb370>
      _stacktrace = <traceback object at 0x7fbde2b09100>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2961730>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
      mock_home = <MagicMock name='home' id='140453528774832'>
      mock_getuser = <MagicMock name='getuser' id='140453528688912'>
      mock_getpass = <MagicMock name='getpass' id='140453520668464'>
      
          @patch("slic.utils.elog.getpass")
          @patch("slic.utils.elog.getuser")
          @patch("slic.utils.elog.Path.home")
          def test_get_default_elog_with_path_home(mock_home, mock_getuser, mock_getpass):
              fake_user = "robot"
              fake_pw = "testpassword"
              mock_getuser.return_value = fake_user
              mock_getpass.return_value = fake_pw  # fallback safety
      
              tmp_home = Path("/tmp/fake_home_for_robot")
              tmp_home.mkdir(parents=True, exist_ok=True)
              pw_file = tmp_home / ".elog_psi"
              pw_file.write_text(fake_pw)
              mock_home.return_value = tmp_home
      
              url = "http://localhost:8080/demo"
      
              try:
                  elog_instance, returned_user = get_default_elog_instance(url)
      
                  assert returned_user == fake_user
                  assert hasattr(elog_instance, "post")
      
      >           r = requests.get(url, auth=(fake_user, fake_pw))
      
      tests/test_utils_elog.py:107: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:73: in get
          return request("get", url, params=params, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde22bb640>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = True
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2961730>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      

      📌 Teardown phase

      duration:

      0.00026803603395819664
      

      outcome:

      passed
      

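    > **Note (editorial, not part of the captured output):** both failures above come from the tests issuing real HTTP requests to `http://localhost:8080`, where no logbook server is listening, so the socket connect fails with `[Errno 111] Connection refused`. A minimal sketch of how such a test could inject a stubbed HTTP getter instead of hitting the network — `fetch_status` and `fake_get` are hypothetical names for illustration, not part of the test suite:

    ```python
    from unittest import mock

    def fetch_status(url, getter):
        """Call the injected HTTP getter and return the response status code."""
        response = getter(url)
        return response.status_code

    # Stand-in for a live server: a MagicMock mimicking a requests.get-style callable.
    fake_get = mock.MagicMock()
    fake_get.return_value.status_code = 200

    status = fetch_status("http://localhost:8080/demo", fake_get)
    assert status == 200
    fake_get.assert_called_once_with("http://localhost:8080/demo")
    ```

    With the transport injected (or patched via `unittest.mock.patch`, as the test already does for `getpass`/`getuser`/`Path.home`), the assertions run without any server on port 8080.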
    Function: test_post

    • Test 5

      📌 Setup phase

      duration:

      0.00013916892930865288
      

      outcome:

      passed
      

      📌 Call phase

      duration:

      0.0074578807689249516
      

      outcome:

      failed
      

      crash:

      path: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
      lineno: 601
      message: elog.logbook_exceptions.LogbookServerProblem: No response from the logbook server.
      Details: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde29de400>: Failed to establish a new connection: [Errno 111] Connection refused'))
      

      traceback:

      -   path: tests/test_utils_elog.py
        lineno: 122
        message: None
      -   path: slic/utils/elog.py
        lineno: 16
        message: in post
      -   path: .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
        lineno: 307
        message: in post
      -   path: .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
        lineno: 601
        message: LogbookServerProblem
      

      longrepr:

      self = <urllib3.connection.HTTPConnection object at 0x7fbddcb55a60>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde29c22e0>
      method = 'POST', url = '/demo/'
      body = b'--b6984113fe407809475136a73a8ea47b\r\nContent-Disposition: form-data; name="Author"\r\n\r\nrobot\r\n--b6984113fe4078...-Disposition: form-data; name="Text"; filename=""\r\n\r\nThis is a message\r\n--b6984113fe407809475136a73a8ea47b--\r\n'
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '735', 'Content-Type': 'multipart/form-data; boundary=b6984113fe407809475136a73a8ea47b'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo/', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbddcb55a60>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbddcb55a60>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
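
The root cause above is `[Errno 111] Connection refused`: the test posts to `http://localhost:8080/demo/`, but nothing is listening on that port. A minimal stdlib sketch (not part of the test suite) reproduces the same OSError that urllib3 wraps in `NewConnectionError`; the free-port probe is an assumption used to make the refusal deterministic:

```python
import socket

# Ask the OS for a free port, then release it so nothing is listening there.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

# Mirrors the connection.create_connection() call in urllib3's _new_conn().
try:
    socket.create_connection(("127.0.0.1", closed_port), timeout=2)
    refused = False
except ConnectionRefusedError:
    refused = True

print("refused:", refused)
```

Starting a listener on port 8080 (or mocking `requests.post`) would make `test_post_local` pass; as run, the refusal is expected.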
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde29c2f40>
      request = <PreparedRequest [POST]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
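
`HTTPAdapter.send()` above normalizes the `timeout` argument before calling `urlopen`; in this run it is `Timeout(connect=None, read=None, total=None)`, i.e. no timeout at all. A stdlib-only sketch of that tuple-vs-scalar handling (`normalize_timeout` is a hypothetical helper name, not the requests API):

```python
def normalize_timeout(timeout):
    """Return a (connect, read) pair, mimicking HTTPAdapter.send()'s handling."""
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                f"or a single float to set both timeouts to the same value."
            )
        return connect, read
    # A single value (or None, as in this failing run) applies to both phases.
    return timeout, timeout

print(normalize_timeout((3.05, 27)))  # (3.05, 27)
print(normalize_timeout(None))        # (None, None)
```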
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'POST', url = '/demo/', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbddcb55a60>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde29c22e0>
      _stacktrace = <traceback object at 0x7fbde2767440>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbddcb55a60>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
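
The `Retry(total=0, ...)` in the frame above is requests' default (`HTTPAdapter.max_retries`), so the very first connection error exhausts the budget and `increment()` raises `MaxRetryError`. A stdlib sketch of that decrement-then-check logic (`RetryExhausted` and `increment` are illustrative names, not the urllib3 API):

```python
class RetryExhausted(Exception):
    """Stands in for urllib3.exceptions.MaxRetryError in this sketch."""

def increment(total, error=None):
    """Decrement the retry budget; raise once it is exhausted."""
    if total is False and error is not None:
        raise error                  # retries disabled: re-raise immediately
    total -= 1                       # this attempt consumes one retry
    if total < 0:                    # mirrors Retry.is_exhausted()
        raise RetryExhausted(f"Max retries exceeded (caused by {error!r})")
    return total

# With total=0, the first failure is already fatal -- exactly what this report shows.
try:
    increment(0, error=ConnectionRefusedError("[Errno 111] Connection refused"))
    outcome = "retried"
except RetryExhausted:
    outcome = "exhausted"
print(outcome)
```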
      
      During handling of the above exception, another exception occurred:
      
      self = <elog.logbook.Logbook object at 0x7fbde29c2670>
      message = 'This is a message', msg_id = None, reply = False
      attributes = {'Author': 'robot', 'When': 1754993032, 'cmd': 'Submit', 'exp': 'demo', ...}
      attachments = [], suppress_email_notification = False, encoding = None
      timeout = None, kwargs = {'Author': 'robot'}
      new_attachment_list = [('Text', ('', b'This is a message'))]
      objects_to_close = []
      attributes_to_edit = {'Author': b'robot', 'When': 1754993032, 'cmd': b'Submit', 'exp': b'demo', ...}
      
          def post(self, message, msg_id=None, reply=False, attributes=None, attachments=None,
                   suppress_email_notification=False, encoding=None, timeout=None, **kwargs):
              """
              Posts a message to the logbook. If msg_id is not specified, a new message will be created; otherwise the
              existing message will be edited, or a reply to it (if reply=True) will be created. This method returns the
              msg_id of the newly created message.
      
              :param message: string with message text
              :param msg_id: ID number of the message to edit or reply to. If not specified, a new message is created.
              :param reply: If True, a reply to the existing message is created instead of editing it
              :param attributes: Dictionary of attributes. The following attributes are used internally by elog and will
                                 be ignored: Text, Date, Encoding, Reply to, In reply to, Locked by, Attachment
              :param attachments: list of:
                                        - file-like objects whose read() returns bytes (if file_like_object.name is not
                                          defined, the default name "attachment<i>" will be used)
                                        - paths to the files
                                  All items will be appended as attachments to the elog entry. In case of an unknown
                                  attachment, a LogbookInvalidAttachment exception will be raised.
              :param suppress_email_notification: If set to True or 1, e-mail notification will be suppressed; defaults to False.
              :param encoding: Defines encoding of the message. Can be: 'plain' -> plain text, 'html'->html-text,
                               'ELCode' --> elog formatting syntax
              :param timeout: Define the timeout to be used by the post request. Its value is directly passed to the requests
                              post. Use None to disable the request timeout.
              :param kwargs: Anything in kwargs will be interpreted as an attribute, e.g. logbook.post('Test text',
                             Author='Rok Vintar'); "Author" will be sent as an attribute. If named the same as one of the
                             attributes defined in "attributes", kwargs take priority.
      
              :return: msg_id
              """
      
              attributes = attributes or {}
              attributes = {**attributes, **kwargs}  # kwargs as attributes with higher priority
      
              attachments = attachments or []
      
              if encoding is not None:
                  if encoding not in ['plain', 'HTML', 'ELCode']:
                      raise LogbookMessageRejected('Invalid message encoding. Valid options: plain, HTML, ELCode.')
                  attributes['Encoding'] = encoding
      
              if suppress_email_notification:
                  attributes["suppress"] = 1
      
              # THE ATTACHMENT STRATEGY WHEN DEALING WITH POST MODIFICATION
              #
              # 1. Does the message on the server have already attachments?
              #    1.1 - We read the message getting the existing attachment list.
              #    1.2 - Add to the attributes dictionary one line for each attachment like this:
              #       attributes['attachmentN'] = timestamped_filename_name
              #
              # 2. Do we have new attachments?
              #    2.1 - Those are in the new_attachment_list. This is a list of this type:
              #       [ ('attfileN', ('filename', fileobject)) ]
              #    2.2 - We need to loop over all the new attachments:
              #       2.2.1 - Does a file already on the server with the same name exist?
              #         2.2.1.1 - No: OK. Then we go ahead with the next attachment.
              #         2.2.1.2 - Yes:
              #           2.2.1.2.1 - Are the two files identical?
              #               2.2.1.2.1.1 - Yes: then we remove this current entry from the new_attachment_list and we leave the one
              #                      already on server.
              #               2.2.1.2.1.2 - No:
              #                  2.2.1.2.1.2.1 - Then the file has been update.
              #                  2.2.1.2.1.2.2 - We need to remove the file on server first (using special post)
              #                  2.2.1.2.1.2.3 - We have to remove the old attachment from the attributes dictionary.
              #
      
              if attachments:
                  # here we accomplish point 2.1.
                  # new_attachment_list is something like [ ('attfileN', ('filename', fileobject)) ]
                  new_attachment_list, objects_to_close = self._prepare_attachments(attachments)
              else:
                  objects_to_close = list()
                  new_attachment_list = list()
      
              attributes_to_edit = dict()
              if msg_id:
                  # Message exists, we can continue
                  if reply:
                      # Verify that there is a message on the server, otherwise do not reply to it!
                      self._check_if_message_on_server(msg_id)  # raises an exception in case of a non-existing message
                      attributes['reply_to'] = str(msg_id)
                  else:  # Edit existing
                      attributes['edit_id'] = str(msg_id)
                      attributes['skiplock'] = '1'
      
                      # here we accomplish point 1.1.
                      # existing_attachments_list is something like:
                      # [ 'https://elog.url.com/logbook/timestamped_filename' ]
                      msg_to_edit, attributes_to_edit, existing_attachments_list = self.read(msg_id)
      
                      for attribute, data in attributes.items():
                          new_data = attributes.get(attribute)
                          if new_data is not None:
                              attributes_to_edit[attribute] = new_data
      
                      i = 0
                      existing_attachments_filename_list = list()
                      for attachment in existing_attachments_list:
                          # here we accomplish point 1.2. We strip the timestamped_filename from the whole URL.
                          attributes_to_edit[f'attachment{i}'] = os.path.basename(attachment)
                          existing_attachments_filename_list.append(os.path.basename(attachment)[14:])
                          i += 1
      
                      # let's accomplish 2.2. Loop over all new attachment
                      duplicate_attachment_list = list()
                      for new_attachment in new_attachment_list:
                          # the new_attachment_list is something like:
                          # [ ('attfileN', ('filename', fileobject)) ]
                          new_attachment_filename = new_attachment[1][0]
                          if new_attachment_filename in existing_attachments_filename_list:
                              # a file with the same name already exists on the server.
                              # we need to check if the two files are the same.
                              # read the content of the new file
                              new_attachment_content = new_attachment[1][1].read()
                              # don't forget to reset the fileobj to the beginning of the file
                              new_attachment[1][1].seek(0)
                              # get the existing attachment content
                              attachment_index = existing_attachments_filename_list.index(new_attachment_filename)
                              existing_attachment_content = self.download_attachment(
                                  url=existing_attachments_list[attachment_index],
                                  timeout=timeout
                              )
                              # check if the two contents are the same
                              if new_attachment_content == existing_attachment_content:
                                  # yes. then we don't upload a second copy. we remove the current entry from the list
                                  duplicate_attachment_list.append(new_attachment)
                              else:
                                  # no. they are not the same file. we will replace the existing file with the new one
                                  # first: we need to remove the attachment from the server using the dedicated method
                                  self.delete_attachment(msg_id, attributes=attributes_to_edit,
                                                         attachment_id=attachment_index,
                                                         timeout=timeout, text=msg_to_edit)
                                  # now we can remove this attachment from the auxiliary lists.
                                  existing_attachments_filename_list.pop(attachment_index)
                                  existing_attachments_list.pop(attachment_index)
                                  # now we need to rebuild the attributes dictionary for the part concerning the attachments.
                                  # we remove all of them first
                                  keys_to_be_removed = list()
                                  for key in attributes_to_edit.keys():
                                      if key.startswith('attachment'):
                                          keys_to_be_removed.append(key)
                                      if key.startswith('delatt'):
                                          keys_to_be_removed.append(key)
                                  for key in keys_to_be_removed:
                                      del attributes_to_edit[key]
      
                                  # now we rebuild it
                                  for i, attachment in enumerate(existing_attachments_list):
                                      attributes_to_edit[f'attachment{i}'] = os.path.basename(attachment)
      
                      # remove all duplicate attachments from the new_attachment_list
                      for attach in duplicate_attachment_list:
                          new_attachment_list.remove(attach)
      
              else:
                  # As we create a new message, specify creation time if not already specified in attributes
                  if 'When' not in attributes:
                      attributes['When'] = int(datetime.now().timestamp())
      
              if not attributes_to_edit:
                  attributes_to_edit = attributes
      
              # Remove any attributes that should not be sent
              _remove_reserved_attributes(attributes_to_edit)
      
              # Make requests module think that Text is a "file". This is the only way to force requests to send data as
              # multipart/form-data even if there are no attachments. Elog understands only multipart/form-data
              new_attachment_list.append(('Text', ('', message.encode('iso-8859-1'))))
      
              # Base attributes are common to all messages
              self._add_base_msg_attributes(attributes_to_edit)
      
              # Keys in attributes cannot have certain characters like whitespaces or dashes for the http request
              attributes_to_edit = _replace_special_characters_in_attribute_keys(attributes_to_edit)
      
              # All string values in the attributes must be encoded in latin1
              attributes_to_edit = _encode_values(attributes_to_edit)
      
              try:
      >           response = requests.post(self._url, data=attributes_to_edit, files=new_attachment_list,
                                           allow_redirects=False, verify=False, timeout=timeout)
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:288: 
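
Just before the failing `requests.post`, `post()` rewrites attribute keys (no whitespace or dashes in form field names) and latin-1-encodes string values. `_replace_special_characters_in_attribute_keys` and `_encode_values` are elog internals, so the exact rules below are an assumption; this stdlib sketch only mirrors the behavior visible in the `attributes_to_edit` dump above (e.g. `'Author': b'robot'`):

```python
def replace_special_characters_in_keys(attributes):
    # Assumed rule: map characters that are invalid in form field names to underscores.
    return {k.replace(" ", "_").replace("-", "_"): v for k, v in attributes.items()}

def encode_values(attributes):
    # String values must be latin-1 (iso-8859-1) for the elog server; others pass through.
    return {k: v.encode("iso-8859-1") if isinstance(v, str) else v
            for k, v in attributes.items()}

attrs = encode_values(replace_special_characters_in_keys(
    {"Author": "robot", "Reply to": "42", "When": 1754993032}))
print(attrs)  # {'Author': b'robot', 'Reply_to': b'42', 'When': 1754993032}
```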
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:115: in post
          return request("post", url, data=data, json=json, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde29c2f40>
      request = <PreparedRequest [POST]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbddcb55a60>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      
      During handling of the above exception, another exception occurred:
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde29de400>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde29de700>
      method = 'GET', url = '/demo/None', body = None
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Cookie': 'unm=robot;upwd=me1T.2jUUqQNa1wNuey9zNBOmOa4eILOaPb.ZSZjpn4;'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo/None', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde29de400>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde29de400>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde29de8b0>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'GET', url = '/demo/None', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde29de400>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde29de700>
      _stacktrace = <traceback object at 0x7fbde29c9a00>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde29de400>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
      self = <elog.logbook.Logbook object at 0x7fbde29c2670>, msg_id = None
      timeout = None
      
          def _check_if_message_on_server(self, msg_id, timeout=None):
              """Try to load page for specific message. If there is a html tag like <td class="errormsg"> then there is no
              such message.
      
              :param msg_id: ID of message to be checked
              :params timeout: The value of timeout to be passed to the get request
              :return:
              """
      
              request_headers = dict()
              if self._user or self._password:
                  request_headers['Cookie'] = self._make_user_and_pswd_cookie()
              try:
      >           response = requests.get(self._url + str(msg_id), headers=request_headers, allow_redirects=False,
                                          verify=False, timeout=timeout)
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:581: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:73: in get
          return request("get", url, params=params, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde29de8b0>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde29de400>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      
      During handling of the above exception, another exception occurred:
      
          def test_post():
              elog = get_test_elog()
      
              title = "AUTHOR_OVERRIDE_TEST"
              text = "This is a message"
              author = "robot"
      
      >       elog.post(text, attributes = {"Author": author})
      
      tests/test_utils_elog.py:122: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      slic/utils/elog.py:16: in post
          return self._log.post(*args, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:307: in post
          self._check_if_message_on_server(msg_id)  # raises exceptions if no message or no response from server
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <elog.logbook.Logbook object at 0x7fbde29c2670>, msg_id = None
      timeout = None
      
          def _check_if_message_on_server(self, msg_id, timeout=None):
              """Try to load page for specific message. If there is a html tag like <td class="errormsg"> then there is no
              such message.
      
              :param msg_id: ID of message to be checked
              :params timeout: The value of timeout to be passed to the get request
              :return:
              """
      
              request_headers = dict()
              if self._user or self._password:
                  request_headers['Cookie'] = self._make_user_and_pswd_cookie()
              try:
                  response = requests.get(self._url + str(msg_id), headers=request_headers, allow_redirects=False,
                                          verify=False, timeout=timeout)
      
                  # If there is no message code 200 will be returned (OK) and _validate_response will not recognise it
                  # but there will be some error in the html code.
                  resp_message, resp_headers, resp_msg_id = _validate_response(response)
                  # If there is no message, code 200 will be returned (OK) but there will be some error indication in
                  # the html code.
                  if re.findall('<td.*?class="errormsg".*?>.*?</td>',
                                resp_message.decode('utf-8', 'ignore'),
                                flags=re.DOTALL):
                      raise LogbookInvalidMessageID('Message with ID: ' + str(msg_id) + ' does not exist on logbook.')
      
              except requests.Timeout as e:
                  # Catch here a timeout o the post request.
                  # Raise the logbook exception and let the user handle it
                  raise LogbookServerTimeout('{0} method cannot be completed because of a network timeout:\n' +
                                             '{1}'.format(sys._getframe().f_code.co_name, e))
      
              except requests.RequestException as e:
      >           raise LogbookServerProblem('No response from the logbook server.\nDetails: ' + '{0}'.format(e))
      E           elog.logbook_exceptions.LogbookServerProblem: No response from the logbook server.
      E           Details: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde29de400>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:601: LogbookServerProblem
      

      📌 Teardown phase

      duration:

      0.0002709999680519104
      

      outcome:

      passed
      

    Function: test_screenshot

    • Test 6

      📌 Setup phase

      duration:

      0.00014484720304608345
      

      outcome:

      passed
      

      📌 Call phase

      duration:

      0.009213382378220558
      

      outcome:

      failed
      

      crash:

      path: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
      lineno: 601
      message: elog.logbook_exceptions.LogbookServerProblem: No response from the logbook server.
      Details: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>: Failed to establish a new connection: [Errno 111] Connection refused'))
      

      traceback:

      -   path: tests/test_utils_elog.py
        lineno: 143
        message: None
      -   path: slic/utils/elog.py
        lineno: 21
        message: in screenshot
      -   path: slic/utils/elog.py
        lineno: 16
        message: in post
      -   path: .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
        lineno: 307
        message: in post
      -   path: .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py
        lineno: 601
        message: LogbookServerProblem
      

      longrepr:

      self = <urllib3.connection.HTTPConnection object at 0x7fbde2b0dfa0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde2be1b20>
      method = 'POST', url = '/demo/'
      body = b'--a1e2d051ffbd8995fe3c14cfa1632476\r\nContent-Disposition: form-data; name="Author"\r\n\r\nrobot\r\n--a1e2d051ffbd89...-data; name="Text"; filename=""\r\n\r\nSCREENSHOT_INTEGRATION_TEST_MSG_456\r\n--a1e2d051ffbd8995fe3c14cfa1632476--\r\n'
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '885', 'Content-Type': 'multipart/form-data; boundary=a1e2d051ffbd8995fe3c14cfa1632476'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo/', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde2b0dfa0>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde2b0dfa0>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde21191c0>
      request = <PreparedRequest [POST]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'POST', url = '/demo/', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0dfa0>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde2be1b20>
      _stacktrace = <traceback object at 0x7fbde22b2a80>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0dfa0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
      self = <elog.logbook.Logbook object at 0x7fbde2119c40>
      message = 'SCREENSHOT_INTEGRATION_TEST_MSG_456', msg_id = None, reply = False
      attributes = {'Author': 'robot', 'When': 1754993033, 'cmd': 'Submit', 'exp': 'demo', ...}
      attachments = ['/tmp/tmpulvmsf9g.png'], suppress_email_notification = False
      encoding = None, timeout = None, kwargs = {'Author': 'robot'}
      new_attachment_list = [('attfile0', ('tmpulvmsf9g.png', <_io.BufferedReader name='/tmp/tmpulvmsf9g.png'>)), ('Text', ('', b'SCREENSHOT_INTEGRATION_TEST_MSG_456'))]
      objects_to_close = [<_io.BufferedReader name='/tmp/tmpulvmsf9g.png'>]
      attributes_to_edit = {'Author': b'robot', 'When': 1754993033, 'cmd': b'Submit', 'exp': b'demo', ...}
      
          def post(self, message, msg_id=None, reply=False, attributes=None, attachments=None,
                   suppress_email_notification=False, encoding=None, timeout=None, **kwargs):
              """
              Posts message to the logbook. If msg_id is not specified new message will be created, otherwise existing
              message will be edited, or a reply (if reply=True) to it will be created. This method returns the msg_id
              of the newly created message.
      
              :param message: string with message text
              :param msg_id: ID number of message to edit or reply. If not specified new message is created.
              :param reply: If 'True' reply to existing message is created instead of editing it
              :param attributes: Dictionary of attributes. Following attributes are used internally by the elog and will be
                                 ignored: Text, Date, Encoding, Reply to, In reply to, Locked by, Attachment
              :param attachments: list of:
                                        - file like objects which read() will return bytes (if file_like_object.name is not
                                          defined, default name "attachment<i>" will be used).
                                        - paths to the files
                                  All items will be appended as attachment to the elog entry. In case of unknown
                                  attachment an exception LogbookInvalidAttachment will be raised.
              :param suppress_email_notification: If set to True or 1, E-Mail notification will be suppressed, defaults to False.
              :param encoding: Defines encoding of the message. Can be: 'plain' -> plain text, 'html'->html-text,
                               'ELCode' --> elog formatting syntax
              :param timeout: Define the timeout to be used by the post request. Its value is directly passed to the requests
                              post. Use None to disable the request timeout.
              :param kwargs: Anything in the kwargs will be interpreted as attribute. e.g.: logbook.post('Test text',
                       Author='Rok Vintar'), "Author" will be sent as an attribute. If named same as one of the
                             attributes defined in "attributes", kwargs will have priority.
      
              :return: msg_id
              """
      
              attributes = attributes or {}
              attributes = {**attributes, **kwargs}  # kwargs as attributes with higher priority
      
              attachments = attachments or []
      
              if encoding is not None:
                  if encoding not in ['plain', 'HTML', 'ELCode']:
                      raise LogbookMessageRejected('Invalid message encoding. Valid options: plain, HTML, ELCode.')
                  attributes['Encoding'] = encoding
      
              if suppress_email_notification:
                  attributes["suppress"] = 1
      
              # THE ATTACHMENT STRATEGY WHEN DEALING WITH POST MODIFICATION
              #
              # 1. Does the message on the server have already attachments?
              #    1.1 - We read the message getting the existing attachment list.
              #    1.2 - Add to the attributes dictionary one line for each attachment like this:
              #       attributes['attachmentN'] = timestamped_filename_name
              #
              # 2. Do we have new attachments?
              #    2.1 - Those are in the new_attachment_list. This is a list of this type:
              #       [ ('attfileN', ('filename', fileobject)) ]
              #    2.2 - We need to loop over all the new attachments:
              #       2.2.1 - Does a file already on the server with the same name exist?
              #         2.2.1.1 - No: OK. Then we go ahead with the next attachment.
              #         2.2.1.2 - Yes:
              #           2.2.1.2.1 - Are the two files identical?
              #               2.2.1.2.1.1 - Yes: then we remove this current entry from the new_attachment_list and we leave the one
              #                      already on server.
              #               2.2.1.2.1.2 - No:
        #                  2.2.1.2.1.2.1 - Then the file has been updated.
              #                  2.2.1.2.1.2.2 - We need to remove the file on server first (using special post)
              #                  2.2.1.2.1.2.3 - We have to remove the old attachment from the attributes dictionary.
              #
      
              if attachments:
                  # here we accomplish point 2.1.
                  # new_attachment_list is something like [ ('attfileN', ('filename', fileobject)) ]
                  new_attachment_list, objects_to_close = self._prepare_attachments(attachments)
              else:
                  objects_to_close = list()
                  new_attachment_list = list()
      
              attributes_to_edit = dict()
              if msg_id:
                  # Message exists, we can continue
                  if reply:
                      # Verify that there is a message on the server, otherwise do not reply to it!
                self._check_if_message_on_server(msg_id)  # raises an exception in case of a non-existent message
                      attributes['reply_to'] = str(msg_id)
                  else:  # Edit existing
                      attributes['edit_id'] = str(msg_id)
                      attributes['skiplock'] = '1'
      
                      # here we accomplish point 1.1.
                      # existing_attachments_list is something like:
                      # [ 'https://elog.url.com/logbook/timestamped_filename' ]
                      msg_to_edit, attributes_to_edit, existing_attachments_list = self.read(msg_id)
      
                      for attribute, data in attributes.items():
                          new_data = attributes.get(attribute)
                          if new_data is not None:
                              attributes_to_edit[attribute] = new_data
      
                      i = 0
                      existing_attachments_filename_list = list()
                      for attachment in existing_attachments_list:
                          # here we accomplish point 1.2. We strip the timestamped_filename from the whole URL.
                          attributes_to_edit[f'attachment{i}'] = os.path.basename(attachment)
                          existing_attachments_filename_list.append(os.path.basename(attachment)[14:])
                          i += 1
      
                      # let's accomplish 2.2. Loop over all new attachment
                      duplicate_attachment_list = list()
                      for new_attachment in new_attachment_list:
                          # the new_attachment_list is something like:
                          # [ ('attfileN', ('filename', fileobject)) ]
                          new_attachment_filename = new_attachment[1][0]
                          if new_attachment_filename in existing_attachments_filename_list:
                        # a file with the same name already exists on the server.
                              # we need to check if the two files are the same.
                              # read the content of the new file
                              new_attachment_content = new_attachment[1][1].read()
                              # don't forget to reset the fileobj to the beginning of the file
                              new_attachment[1][1].seek(0)
                              # get the existing attachment content
                              attachment_index = existing_attachments_filename_list.index(new_attachment_filename)
                              existing_attachment_content = self.download_attachment(
                                  url=existing_attachments_list[attachment_index],
                                  timeout=timeout
                              )
                              # check if the two contents are the same
                              if new_attachment_content == existing_attachment_content:
                                  # yes. then we don't upload a second copy. we remove the current entry from the list
                                  duplicate_attachment_list.append(new_attachment)
                              else:
                                  # no. they are not the same file. we will replace the existing file with the new one
                                  # first: we need to remove the attachment from the server using the dedicated method
                                  self.delete_attachment(msg_id, attributes=attributes_to_edit,
                                                         attachment_id=attachment_index,
                                                         timeout=timeout, text=msg_to_edit)
                                  # now we can remove this attachment from the auxiliary lists.
                                  existing_attachments_filename_list.pop(attachment_index)
                                  existing_attachments_list.pop(attachment_index)
                                  # now we need to rebuild the attributes dictionary for the part concerning the attachments.
                                  # we remove all of them first
                                  keys_to_be_removed = list()
                                  for key in attributes_to_edit.keys():
                                      if key.startswith('attachment'):
                                          keys_to_be_removed.append(key)
                                      if key.startswith('delatt'):
                                          keys_to_be_removed.append(key)
                                  for key in keys_to_be_removed:
                                      del attributes_to_edit[key]
      
                                  # now we rebuild it
                                  for i, attachment in enumerate(existing_attachments_list):
                                      attributes_to_edit[f'attachment{i}'] = os.path.basename(attachment)
      
                      # remove all duplicate attachments from the new_attachment_list
                      for attach in duplicate_attachment_list:
                          new_attachment_list.remove(attach)
      
              else:
                  # As we create a new message, specify creation time if not already specified in attributes
                  if 'When' not in attributes:
                      attributes['When'] = int(datetime.now().timestamp())
      
              if not attributes_to_edit:
                  attributes_to_edit = attributes
      
              # Remove any attributes that should not be sent
              _remove_reserved_attributes(attributes_to_edit)
      
              # Make requests module think that Text is a "file". This is the only way to force requests to send data as
              # multipart/form-data even if there are no attachments. Elog understands only multipart/form-data
              new_attachment_list.append(('Text', ('', message.encode('iso-8859-1'))))
      
              # Base attributes are common to all messages
              self._add_base_msg_attributes(attributes_to_edit)
      
              # Keys in attributes cannot have certain characters like whitespaces or dashes for the http request
              attributes_to_edit = _replace_special_characters_in_attribute_keys(attributes_to_edit)
      
              # All string values in the attributes must be encoded in latin1
              attributes_to_edit = _encode_values(attributes_to_edit)
      
              try:
      >           response = requests.post(self._url, data=attributes_to_edit, files=new_attachment_list,
                                           allow_redirects=False, verify=False, timeout=timeout)
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:288: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:115: in post
          return request("post", url, data=data, json=json, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde21191c0>
      request = <PreparedRequest [POST]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0dfa0>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      
      During handling of the above exception, another exception occurred:
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
      >           sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:199: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:85: in create_connection
          raise err
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      address = ('localhost', 8080), timeout = None, source_address = None
      socket_options = [(6, 1, 1)]
      
          def create_connection(
              address: tuple[str, int],
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              source_address: tuple[str, int] | None = None,
              socket_options: _TYPE_SOCKET_OPTIONS | None = None,
          ) -> socket.socket:
              """Connect to *address* and return the socket object.
      
              Convenience function.  Connect to *address* (a 2-tuple ``(host,
              port)``) and return the socket object.  Passing the optional
              *timeout* parameter will set the timeout on the socket instance
              before attempting to connect.  If no *timeout* is supplied, the
              global default timeout setting returned by :func:`socket.getdefaulttimeout`
              is used.  If *source_address* is set it must be a tuple of (host, port)
              for the socket to bind as a source address before making the connection.
              An host of '' or port 0 tells the OS to use the default.
              """
      
              host, port = address
              if host.startswith("["):
                  host = host.strip("[]")
              err = None
      
              # Using the value from allowed_gai_family() in the context of getaddrinfo lets
              # us select whether to work with IPv4 DNS records, IPv6 records, or both.
              # The original create_connection function always returns all records.
              family = allowed_gai_family()
      
              try:
                  host.encode("idna")
              except UnicodeError:
                  raise LocationParseError(f"'{host}', label empty or too long") from None
      
              for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
                  af, socktype, proto, canonname, sa = res
                  sock = None
                  try:
                      sock = socket.socket(af, socktype, proto)
      
                      # If provided, set socket level options before connecting.
                      _set_socket_options(sock, socket_options)
      
                      if timeout is not _DEFAULT_TIMEOUT:
                          sock.settimeout(timeout)
                      if source_address:
                          sock.bind(source_address)
      >               sock.connect(sa)
      E               ConnectionRefusedError: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
      
      The above exception was the direct cause of the following exception:
      
      self = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde2b0d8b0>
      method = 'GET', url = '/demo/None', body = None
      headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate, br, zstd', 'Accept': '*/*', 'Connection': 'keep-alive', 'Cookie': 'unm=robot;upwd=me1T.2jUUqQNa1wNuey9zNBOmOa4eILOaPb.ZSZjpn4;'}
      retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      redirect = False, assert_same_host = False
      timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None
      release_conn = False, chunked = False, body_pos = None, preload_content = False
      decode_content = False, response_kw = {}
      parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/demo/None', query=None, fragment=None)
      destination_scheme = None, conn = None, release_this_conn = True
      http_tunnel_required = False, err = None, clean_exit = False
      
          def urlopen(  # type: ignore[override]
              self,
              method: str,
              url: str,
              body: _TYPE_BODY | None = None,
              headers: typing.Mapping[str, str] | None = None,
              retries: Retry | bool | int | None = None,
              redirect: bool = True,
              assert_same_host: bool = True,
              timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
              pool_timeout: int | None = None,
              release_conn: bool | None = None,
              chunked: bool = False,
              body_pos: _TYPE_BODY_POSITION | None = None,
              preload_content: bool = True,
              decode_content: bool = True,
              **response_kw: typing.Any,
          ) -> BaseHTTPResponse:
              """
              Get a connection from the pool and perform an HTTP request. This is the
              lowest level call for making a request, so you'll need to specify all
              the raw details.
      
              .. note::
      
                 More commonly, it's appropriate to use a convenience method
                 such as :meth:`request`.
      
              .. note::
      
                 `release_conn` will only behave as expected if
                 `preload_content=False` because we want to make
                 `preload_content=False` the default behaviour someday soon without
                 breaking backwards compatibility.
      
              :param method:
                  HTTP request method (such as GET, POST, PUT, etc.)
      
              :param url:
                  The URL to perform the request on.
      
              :param body:
                  Data to send in the request body, either :class:`str`, :class:`bytes`,
                  an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
              :param headers:
                  Dictionary of custom headers to send, such as User-Agent,
                  If-None-Match, etc. If None, pool headers are used. If provided,
                  these headers completely replace any pool-specific headers.
      
              :param retries:
                  Configure the number of retries to allow before raising a
                  :class:`~urllib3.exceptions.MaxRetryError` exception.
      
                  If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
                  :class:`~urllib3.util.retry.Retry` object for fine-grained control
                  over different types of retries.
                  Pass an integer number to retry connection errors that many times,
                  but no other types of errors. Pass zero to never retry.
      
                  If ``False``, then retries are disabled and any exception is raised
                  immediately. Also, instead of raising a MaxRetryError on redirects,
                  the redirect response will be returned.
      
              :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
              :param redirect:
                  If True, automatically handle redirects (status codes 301, 302,
                  303, 307, 308). Each redirect counts as a retry. Disabling retries
                  will disable redirect, too.
      
              :param assert_same_host:
                  If ``True``, will make sure that the host of the pool requests is
                  consistent else will raise HostChangedError. When ``False``, you can
                  use the pool on an HTTP proxy and request foreign hosts.
      
              :param timeout:
                  If specified, overrides the default timeout for this one
                  request. It may be a float (in seconds) or an instance of
                  :class:`urllib3.util.Timeout`.
      
              :param pool_timeout:
                  If set and the pool is set to block=True, then this method will
                  block for ``pool_timeout`` seconds and raise EmptyPoolError if no
                  connection is available within the time period.
      
              :param bool preload_content:
                  If True, the response's body will be preloaded into memory.
      
              :param bool decode_content:
                  If True, will attempt to decode the body based on the
                  'content-encoding' header.
      
              :param release_conn:
                  If False, then the urlopen call will not release the connection
                  back into the pool once a response is received (but will release if
                  you read the entire contents of the response such as when
                  `preload_content=True`). This is useful if you're not preloading
                  the response's content immediately. You will need to call
                  ``r.release_conn()`` on the response ``r`` to return the connection
                  back into the pool. If None, it takes the value of ``preload_content``
                  which defaults to ``True``.
      
              :param bool chunked:
                  If True, urllib3 will send the body using chunked transfer
                  encoding. Otherwise, urllib3 will send the body using the standard
                  content-length form. Defaults to False.
      
              :param int body_pos:
                  Position to seek to in file-like body in the event of a retry or
                  redirect. Typically this won't need to be set because urllib3 will
                  auto-populate the value when needed.
              """
              parsed_url = parse_url(url)
              destination_scheme = parsed_url.scheme
      
              if headers is None:
                  headers = self.headers
      
              if not isinstance(retries, Retry):
                  retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
              if release_conn is None:
                  release_conn = preload_content
      
              # Check host
              if assert_same_host and not self.is_same_host(url):
                  raise HostChangedError(self, url, retries)
      
              # Ensure that the URL we're connecting to is properly encoded
              if url.startswith("/"):
                  url = to_str(_encode_target(url))
              else:
                  url = to_str(parsed_url.url)
      
              conn = None
      
              # Track whether `conn` needs to be released before
              # returning/raising/recursing. Update this variable if necessary, and
              # leave `release_conn` constant throughout the function. That way, if
              # the function recurses, the original value of `release_conn` will be
              # passed down into the recursive call, and its value will be respected.
              #
              # See issue #651 [1] for details.
              #
              # [1] <https://github.com/urllib3/urllib3/issues/651>
              release_this_conn = release_conn
      
              http_tunnel_required = connection_requires_http_tunnel(
                  self.proxy, self.proxy_config, destination_scheme
              )
      
              # Merge the proxy headers. Only done when not using HTTP CONNECT. We
              # have to copy the headers dict so we can safely change it without those
              # changes being reflected in anyone else's copy.
              if not http_tunnel_required:
                  headers = headers.copy()  # type: ignore[attr-defined]
                  headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
              # Must keep the exception bound to a separate variable or else Python 3
              # complains about UnboundLocalError.
              err = None
      
              # Keep track of whether we cleanly exited the except block. This
              # ensures we do proper cleanup in finally.
              clean_exit = False
      
              # Rewind body position, if needed. Record current position
              # for future rewinds in the event of a redirect/retry.
              body_pos = set_file_position(body, body_pos)
      
              try:
                  # Request a connection from the queue.
                  timeout_obj = self._get_timeout(timeout)
                  conn = self._get_conn(timeout=pool_timeout)
      
                  conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
                  # Is this a closed/new connection that requires CONNECT tunnelling?
                  if self.proxy is not None and http_tunnel_required and conn.is_closed:
                      try:
                          self._prepare_proxy(conn)
                      except (BaseSSLError, OSError, SocketTimeout) as e:
                          self._raise_timeout(
                              err=e, url=self.proxy.url, timeout_value=conn.timeout
                          )
                          raise
      
                  # If we're going to release the connection in ``finally:``, then
                  # the response doesn't need to know about the connection. Otherwise
                  # it will also try to release it and we'll have a double-release
                  # mess.
                  response_conn = conn if not release_conn else None
      
                  # Make the request on the HTTPConnection object
      >           response = self._make_request(
                      conn,
                      method,
                      url,
                      timeout=timeout_obj,
                      body=body,
                      headers=headers,
                      chunked=chunked,
                      retries=retries,
                      response_conn=response_conn,
                      preload_content=preload_content,
                      decode_content=decode_content,
                      **response_kw,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:789: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:495: in _make_request
          conn.request(
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:441: in request
          self.endheaders()
      .pixi/envs/default/lib/python3.8/http/client.py:1251: in endheaders
          self._send_output(message_body, encode_chunked=encode_chunked)
      .pixi/envs/default/lib/python3.8/http/client.py:1011: in _send_output
          self.send(msg)
      .pixi/envs/default/lib/python3.8/http/client.py:951: in send
          self.connect()
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:279: in connect
          self.sock = self._new_conn()
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>
      
          def _new_conn(self) -> socket.socket:
              """Establish a socket connection and set nodelay settings on it.
      
              :return: New socket connection.
              """
              try:
                  sock = connection.create_connection(
                      (self._dns_host, self.port),
                      self.timeout,
                      source_address=self.source_address,
                      socket_options=self.socket_options,
                  )
              except socket.gaierror as e:
                  raise NameResolutionError(self.host, self, e) from e
              except SocketTimeout as e:
                  raise ConnectTimeoutError(
                      self,
                      f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
                  ) from e
      
              except OSError as e:
      >           raise NewConnectionError(
                      self, f"Failed to establish a new connection: {e}"
                  ) from e
      E           urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>: Failed to establish a new connection: [Errno 111] Connection refused
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connection.py:214: NewConnectionError
      
      The above exception was the direct cause of the following exception:
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde2b0d430>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
      >           resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:667: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/connectionpool.py:843: in urlopen
          retries = retries.increment(
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
      method = 'GET', url = '/demo/None', response = None
      error = NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>: Failed to establish a new connection: [Errno 111] Connection refused')
      _pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x7fbde2b0d8b0>
      _stacktrace = <traceback object at 0x7fbde18ec8c0>
      
          def increment(
              self,
              method: str | None = None,
              url: str | None = None,
              response: BaseHTTPResponse | None = None,
              error: Exception | None = None,
              _pool: ConnectionPool | None = None,
              _stacktrace: TracebackType | None = None,
          ) -> Self:
              """Return a new Retry object with incremented retry counters.
      
              :param response: A response object, or None, if the server did not
                  return a response.
              :type response: :class:`~urllib3.response.BaseHTTPResponse`
              :param Exception error: An error encountered during the request, or
                  None if the response was received successfully.
      
              :return: A new ``Retry`` object.
              """
              if self.total is False and error:
                  # Disabled, indicate to re-raise the error.
                  raise reraise(type(error), error, _stacktrace)
      
              total = self.total
              if total is not None:
                  total -= 1
      
              connect = self.connect
              read = self.read
              redirect = self.redirect
              status_count = self.status
              other = self.other
              cause = "unknown"
              status = None
              redirect_location = None
      
              if error and self._is_connection_error(error):
                  # Connect retry?
                  if connect is False:
                      raise reraise(type(error), error, _stacktrace)
                  elif connect is not None:
                      connect -= 1
      
              elif error and self._is_read_error(error):
                  # Read retry?
                  if read is False or method is None or not self._is_method_retryable(method):
                      raise reraise(type(error), error, _stacktrace)
                  elif read is not None:
                      read -= 1
      
              elif error:
                  # Other retry?
                  if other is not None:
                      other -= 1
      
              elif response and response.get_redirect_location():
                  # Redirect retry?
                  if redirect is not None:
                      redirect -= 1
                  cause = "too many redirects"
                  response_redirect_location = response.get_redirect_location()
                  if response_redirect_location:
                      redirect_location = response_redirect_location
                  status = response.status
      
              else:
                  # Incrementing because of a server error like a 500 in
                  # status_forcelist and the given method is in the allowed_methods
                  cause = ResponseError.GENERIC_ERROR
                  if response and response.status:
                      if status_count is not None:
                          status_count -= 1
                      cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
                      status = response.status
      
              history = self.history + (
                  RequestHistory(method, url, error, status, redirect_location),
              )
      
              new_retry = self.new(
                  total=total,
                  connect=connect,
                  read=read,
                  redirect=redirect,
                  status=status_count,
                  other=other,
                  history=history,
              )
      
              if new_retry.is_exhausted():
                  reason = error or ResponseError(cause)
      >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
      E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/urllib3/util/retry.py:519: MaxRetryError
      
      During handling of the above exception, another exception occurred:
      
      self = <elog.logbook.Logbook object at 0x7fbde2119c40>, msg_id = None
      timeout = None
      
          def _check_if_message_on_server(self, msg_id, timeout=None):
              """Try to load page for specific message. If there is a html tag like <td class="errormsg"> then there is no
              such message.
      
              :param msg_id: ID of message to be checked
              :params timeout: The value of timeout to be passed to the get request
              :return:
              """
      
              request_headers = dict()
              if self._user or self._password:
                  request_headers['Cookie'] = self._make_user_and_pswd_cookie()
              try:
      >           response = requests.get(self._url + str(msg_id), headers=request_headers, allow_redirects=False,
                                          verify=False, timeout=timeout)
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:581: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:73: in get
          return request("get", url, params=params, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/api.py:59: in request
          return session.request(method=method, url=url, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:589: in request
          resp = self.send(prep, **send_kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/requests/sessions.py:703: in send
          r = adapter.send(request, **kwargs)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <requests.adapters.HTTPAdapter object at 0x7fbde2b0d430>
      request = <PreparedRequest [GET]>, stream = False
      timeout = Timeout(connect=None, read=None, total=None), verify = False
      cert = None, proxies = OrderedDict()
      
          def send(
              self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
          ):
              """Sends PreparedRequest object. Returns Response object.
      
              :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
              :param stream: (optional) Whether to stream the request content.
              :param timeout: (optional) How long to wait for the server to send
                  data before giving up, as a float, or a :ref:`(connect timeout,
                  read timeout) <timeouts>` tuple.
              :type timeout: float or tuple or urllib3 Timeout object
              :param verify: (optional) Either a boolean, in which case it controls whether
                  we verify the server's TLS certificate, or a string, in which case it
                  must be a path to a CA bundle to use
              :param cert: (optional) Any user-provided SSL certificate to be trusted.
              :param proxies: (optional) The proxies dictionary to apply to the request.
              :rtype: requests.Response
              """
      
              try:
                  conn = self.get_connection_with_tls_context(
                      request, verify, proxies=proxies, cert=cert
                  )
              except LocationValueError as e:
                  raise InvalidURL(e, request=request)
      
              self.cert_verify(conn, request.url, verify, cert)
              url = self.request_url(request, proxies)
              self.add_headers(
                  request,
                  stream=stream,
                  timeout=timeout,
                  verify=verify,
                  cert=cert,
                  proxies=proxies,
              )
      
              chunked = not (request.body is None or "Content-Length" in request.headers)
      
              if isinstance(timeout, tuple):
                  try:
                      connect, read = timeout
                      timeout = TimeoutSauce(connect=connect, read=read)
                  except ValueError:
                      raise ValueError(
                          f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
                          f"or a single float to set both timeouts to the same value."
                      )
              elif isinstance(timeout, TimeoutSauce):
                  pass
              else:
                  timeout = TimeoutSauce(connect=timeout, read=timeout)
      
              try:
                  resp = conn.urlopen(
                      method=request.method,
                      url=url,
                      body=request.body,
                      headers=request.headers,
                      redirect=False,
                      assert_same_host=False,
                      preload_content=False,
                      decode_content=False,
                      retries=self.max_retries,
                      timeout=timeout,
                      chunked=chunked,
                  )
      
              except (ProtocolError, OSError) as err:
                  raise ConnectionError(err, request=request)
      
              except MaxRetryError as e:
                  if isinstance(e.reason, ConnectTimeoutError):
                      # TODO: Remove this in 3.0.0: see #2811
                      if not isinstance(e.reason, NewConnectionError):
                          raise ConnectTimeout(e, request=request)
      
                  if isinstance(e.reason, ResponseError):
                      raise RetryError(e, request=request)
      
                  if isinstance(e.reason, _ProxyError):
                      raise ProxyError(e, request=request)
      
                  if isinstance(e.reason, _SSLError):
                      # This branch is for urllib3 v1.22 and later.
                      raise SSLError(e, request=request)
      
      >           raise ConnectionError(e, request=request)
      E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/requests/adapters.py:700: ConnectionError
      
      During handling of the above exception, another exception occurred:
      
      mock_screenshot_class = <MagicMock name='Screenshot' id='140453519990064'>
      
          @patch("slic.utils.elog.Screenshot")
          def test_screenshot(mock_screenshot_class):
              with tempfile.NamedTemporaryFile(delete=False, suffix=".png") as tmp:
                  fake_path = tmp.name
                  tmp.write(b"fake image data")
      
              mock_instance = mock_screenshot_class.return_value
              mock_instance.shoot.return_value = [fake_path]
      
              elog = get_test_elog()
      
              test_msg = "SCREENSHOT_INTEGRATION_TEST_MSG_456"
      >       entry_id = elog.screenshot(message=test_msg)
      
      tests/test_utils_elog.py:143: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      slic/utils/elog.py:21: in screenshot
          self.post(message, **kwargs)
      slic/utils/elog.py:16: in post
          return self._log.post(*args, **kwargs)
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:307: in post
          self._check_if_message_on_server(msg_id)  # raises exceptions if no message or no response from server
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      
      self = <elog.logbook.Logbook object at 0x7fbde2119c40>, msg_id = None
      timeout = None
      
          def _check_if_message_on_server(self, msg_id, timeout=None):
              """Try to load page for specific message. If there is a html tag like <td class="errormsg"> then there is no
              such message.
      
              :param msg_id: ID of message to be checked
              :params timeout: The value of timeout to be passed to the get request
              :return:
              """
      
              request_headers = dict()
              if self._user or self._password:
                  request_headers['Cookie'] = self._make_user_and_pswd_cookie()
              try:
                  response = requests.get(self._url + str(msg_id), headers=request_headers, allow_redirects=False,
                                          verify=False, timeout=timeout)
      
                  # If there is no message code 200 will be returned (OK) and _validate_response will not recognise it
                  # but there will be some error in the html code.
                  resp_message, resp_headers, resp_msg_id = _validate_response(response)
                  # If there is no message, code 200 will be returned (OK) but there will be some error indication in
                  # the html code.
                  if re.findall('<td.*?class="errormsg".*?>.*?</td>',
                                resp_message.decode('utf-8', 'ignore'),
                                flags=re.DOTALL):
                      raise LogbookInvalidMessageID('Message with ID: ' + str(msg_id) + ' does not exist on logbook.')
      
              except requests.Timeout as e:
                  # Catch here a timeout o the post request.
                  # Raise the logbook exception and let the user handle it
                  raise LogbookServerTimeout('{0} method cannot be completed because of a network timeout:\n' +
                                             '{1}'.format(sys._getframe().f_code.co_name, e))
      
              except requests.RequestException as e:
      >           raise LogbookServerProblem('No response from the logbook server.\nDetails: ' + '{0}'.format(e))
      E           elog.logbook_exceptions.LogbookServerProblem: No response from the logbook server.
      E           Details: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /demo/None (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fbde2b0d280>: Failed to establish a new connection: [Errno 111] Connection refused'))
      
      .pixi/envs/default/lib/python3.8/site-packages/elog/logbook.py:601: LogbookServerProblem
      

      📌 Teardown phase

      duration:

      0.0002595130354166031
      

      outcome:

      passed
      

📚 Collected files

(1 test)
    • Outcome: passed
    • result:
    -   nodeid: tests/test_utils_elog.py
      type: Module
    
tests (1 test)
  • tests/test_utils_elog.py
    • Outcome: passed
    • result:
    -   nodeid: tests/test_utils_elog.py::test_post_local
      type: Function
      lineno: 29
    -   nodeid: tests/test_utils_elog.py::test_get_default_elog_instance_with_direct_password_and_real_check
      type: Function
      lineno: 52
    -   nodeid: tests/test_utils_elog.py::test_get_default_elog_instance_asks_password_and_opens
      type: Function
      lineno: 65
    -   nodeid: tests/test_utils_elog.py::test_get_default_elog_with_path_home
      type: Function
      lineno: 83
    -   nodeid: tests/test_utils_elog.py::test_post
      type: Function
      lineno: 114
    -   nodeid: tests/test_utils_elog.py::test_screenshot
      type: Function
      lineno: 130
    

⚠️ Warnings

Warning nº1
  • message: invalid escape sequence `\-`
  • category: DeprecationWarning
  • when: collect
  • filename: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/bsread/h5.py
  • lineno: 207

Warning nº2
  • message: The module numpy.dual is deprecated. Instead of using dual, use the functions directly from numpy or scipy.
  • category: DeprecationWarning
  • when: collect
  • filename: /workspace/tligui_y/slic/.pixi/envs/default/lib/python3.8/site-packages/scipy/fft/__init__.py
  • lineno: 97
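💡 Note

All six failures above share one root cause: the elog client tries to reach a live logbook server at `localhost:8080` and gets `[Errno 111] Connection refused`. One way to keep such tests hermetic is to stub the network-facing call with `unittest.mock` so no socket is ever opened. The sketch below illustrates the pattern only; `LogbookClient` and `create_entry` are hypothetical stand-ins, not the real `elog` API, and patching the actual `Logbook.post` (or the underlying `requests` calls) would need to match that library's real signatures.

```python
from unittest.mock import patch


class LogbookClient:
    """Illustrative stand-in for a client that posts entries over HTTP."""

    def post(self, message):
        # In a real run this would open a socket and fail without a server.
        raise ConnectionRefusedError("no logbook server on localhost:8080")


def create_entry(client, message):
    # Application code under test: delegates the network call to the client.
    return client.post(message)


def test_create_entry_without_server():
    client = LogbookClient()
    # Replace the network-facing method for the duration of the test,
    # so no connection to localhost:8080 is ever attempted.
    with patch.object(client, "post", return_value=42) as mock_post:
        entry_id = create_entry(client, "TEST_MSG")
    assert entry_id == 42
    mock_post.assert_called_once_with("TEST_MSG")


test_create_entry_without_server()
print("ok")
```

With this shape, the suite no longer depends on a running server; an end-to-end test against a real elog instance can then be marked separately (e.g. skipped unless the server is reachable).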