Mender Production update from v2.2.0 to v2.3.0

Hello,
After reading the update document for the mender-production-setup branch of Mender 2.2.0, to update to version 2.3.0, I'm not sure how to do it: it seems that git fetch origin --tags doesn't do anything, even after running git remote update origin.
Can you tell me what I am doing wrong?
Sorry if the question is very basic, but I am not yet used enough to git to make these changes.

Thanks for your help :slight_smile:


Hi @blackdevice and welcome to Mender Hub :slight_smile:

I’m afraid I’m not quite following, though. What do you mean by "git fetch origin does not do its work"?

What is the output of git tag?
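For reference, a minimal check that should pull the tags down and list the relevant ones (assuming your remote is called origin, as in the tutorial):

git fetch origin --tags
git tag --list '2.3.0*'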

Cheers
Ole

Hi,
thanks for your reply,

I mean that when I run:

git remote update origin 
and
git fetch origin --tags

they do not produce this output, shown in the tutorial:

Fetching origin  
remote: Counting objects: 367, done.  
remote: Compressing objects: 100% (31/31), done.  
remote: Total 367 (delta 134), reused 122 (delta 122), pack-reused 214  
Receiving objects: 100% (367/367), 83.55 KiB | 0 bytes/s, done.  
Resolving deltas: 100% (214/214), completed with 42 local objects.  
From https://github.com/mendersoftware/integration  
   02cd118..75b7831  2.3.0b1      -> origin/2.3.0b1
   06f3212..e9e5df4  2.3.0b1     -> origin/2.3.0b1  

They also don't give any error,
but when I try to do the integration with git merge 2.3.0b1, nothing seems to happen.

I must be missing something along the way… Sorry again, I don't know much about git…

info:

git tag output

1.0.0
1.0.0-build1
1.0.1
1.1.0
1.1.0b1
1.1.1
1.1.2
1.1.3
1.2.0
1.2.0b1
1.2.1
1.2.2
1.3.0
1.3.0b1
1.3.1
1.4.0
1.4.0b1
1.4.1
1.4.2
1.5.0
1.5.0b1
1.5.1
1.6.0
1.6.0b1
1.6.1
1.7.0
1.7.0b1
1.7.1
2.0.0
2.0.0b1
2.0.1
2.1.0
2.1.0b1
2.1.1
2.2.0
2.2.0b1
2.2.1
2.3.0b1
integration-0.1

What doesn’t happen, exactly?

Also, let’s verify which branch you are actually on. Please run

git log --max-count=1

and show the output

I currently have the production branch active:

commit b426f2c6915dde93007d4b33ed5af874bc6ed7d5 (HEAD -> my-production-setup)

When I run the merge:

git merge 2.3.0b1
Auto-merging testutils/infra/smtpd_mock.py
Removing testutils/infra/docker.py
Auto-merging testutils/infra/container_manager/__init__.py
Auto-merging tests/tests/test_demo_artifact.py
CONFLICT (content): Merge conflict in tests/tests/test_demo_artifact.py
Removing tests/tests/test_decommission.py
Removing tests/tests/test_create_organization.py
Removing tests/tests/driver.py
Removing tests/tests/amazon_s3/test_01_s3.py
Removing tests/common_docker.py
CONFLICT (modify/delete): production/config/enterprise.yml.template deleted in HEAD and modified in 2.3.0b1. Version 2.3.0b1 of production/config/enterprise.yml.template left in tree.
Auto-merging other-components.yml
CONFLICT (content): Merge conflict in other-components.yml
Auto-merging extra/release_tool.py
CONFLICT (content): Merge conflict in extra/release_tool.py
Removing extra/kubernetes/templates/service.template.yml
Removing extra/kubernetes/mongo-mender-useradm-service.yaml
Removing extra/kubernetes/mongo-mender-inventory-service.yaml
Removing extra/kubernetes/mongo-mender-device-auth-service.yaml
Removing extra/kubernetes/mongo-deployments-service.yaml
Removing extra/kubernetes/minio-service.yaml
Removing extra/kubernetes/minio-deployment.yaml
Removing extra/kubernetes/mender-useradm-service.yaml
Removing extra/kubernetes/mender-useradm-deployment.yaml
Removing extra/kubernetes/mender-mongo-useradm-deployment.yaml
Removing extra/kubernetes/mender-mongo-inventory-deployment.yaml
Removing extra/kubernetes/mender-mongo-device-auth-deployment.yaml
Removing extra/kubernetes/mender-mongo-deployments-deployment.yaml
Removing extra/kubernetes/mender-inventory-service.yaml
Removing extra/kubernetes/mender-inventory-deployment.yaml
Removing extra/kubernetes/mender-gui-service.yaml
Removing extra/kubernetes/mender-gui-deployment.yaml
Removing extra/kubernetes/mender-etcd-deployment.yaml
Removing extra/kubernetes/mender-device-auth-service.yaml
Removing extra/kubernetes/mender-device-auth-deployment.yaml
Removing extra/kubernetes/mender-deployments-service.yaml
Removing extra/kubernetes/mender-deployments-deployment.yaml
Removing extra/kubernetes/mender-client-deployment.yaml
Removing extra/kubernetes/mender-api-gateway-service.yaml
Removing extra/kubernetes/mender-api-gateway-deployment.yaml
Auto-merging docker-compose.yml
CONFLICT (content): Merge conflict in docker-compose.yml
Auto-merging docker-compose.testing.enterprise.yml
Removing docker-compose.storage.s3.yml
Auto-merging docker-compose.enterprise.yml
CONFLICT (content): Merge conflict in docker-compose.enterprise.yml
Auto-merging docker-compose.docker-client.yml
CONFLICT (content): Merge conflict in docker-compose.docker-client.yml
Auto-merging docker-compose.client.yml
CONFLICT (content): Merge conflict in docker-compose.client.yml
Auto-merging docker-compose.client.rofs.yml
CONFLICT (content): Merge conflict in docker-compose.client.rofs.yml
Auto-merging demo
Removing conductor/elasticsearch/elasticsearch.yml
Auto-merging component-maps.yml
Removing .travis.yml
Automatic merge failed; fix conflicts and then commit the result.

It seems to me like it works just fine

Have a little read through here: https://www.atlassian.com/git/tutorials/using-branches/git-merge :slight_smile:
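The conflicts themselves are expected when your production branch carries local changes; the usual flow is roughly this (a generic sketch, not specific to Mender):

git status                   # list the files that still have conflicts
# edit each conflicted file and keep the hunks you want
git add <resolved-file>      # mark a file as resolved
git commit                   # conclude the merge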

Oh, I think this will help me a lot!
Thank you very much for your help. I didn't know whether those conflicts were normal; I will try to resolve them and post the result here :slight_smile:

Thanks again for your time and help.


Great!

Also, you can have a look at this: Upgrading Mender Server from 2.0 to 2.1


OK, I think I have solved all the git merge problems; now I have a new issue to ask about :slight_smile:

Once I have run:
production # git merge 2.3.0b1
everything is up to date.

I then run the following commands:

/ production # ./run pull
Pulling minio ... done
Pulling mender-redis ... done
Pulling storage-proxy ... done
Pulling mender-gui ... done
Pulling mender-mongo ... done
Pulling mender-inventory ... done
Pulling mender-useradm ... done
Pulling mender-create-artifact-worker ... done
Pulling mender-workflows-server ... done
Pulling mender-deployments ... done
Pulling mender-elasticsearch ... done
Pulling mender-conductor ... done
Pulling mender-device-auth ... done
Pulling mender-api-gateway ... done

/ production # ./run stop
Stopping menderproduction_mender-deployments_1 ... done
Stopping menderproduction_storage-proxy_1 ... done
Stopping menderproduction_mender-workflows-server_1 ... done
Stopping menderproduction_mender-create-artifact-worker_1 ... done
Stopping menderproduction_mender-inventory_1 ... done
Stopping menderproduction_mender-useradm_1 ... done
Stopping menderproduction_mender-redis_1 ... done
Stopping menderproduction_minio_1 ... done
Stopping menderproduction_mender-gui_1 ... done
Stopping menderproduction_mender-mongo_1 ... done

/ production # ./run rm
Going to remove menderproduction_mender-deployments_1, menderproduction_storage-proxy_1, menderproduction_mender-workflows-server_1, menderproduction_mender-create-artifact-worker_1, menderproduction_mender-inventory_1, menderproduction_mender-useradm_1, menderproduction_mender-redis_1, menderproduction_minio_1, menderproduction_mender-gui_1, menderproduction_mender-elasticsearch_1, menderproduction_mender-mongo_1
Are you sure? [yN] y
Removing menderproduction_mender-deployments_1 ... done
Removing menderproduction_storage-proxy_1 ... done
Removing menderproduction_mender-workflows-server_1 ... done
Removing menderproduction_mender-create-artifact-worker_1 ... done
Removing menderproduction_mender-inventory_1 ... done
Removing menderproduction_mender-useradm_1 ... done
Removing menderproduction_mender-redis_1 ... done
Removing menderproduction_minio_1 ... done
Removing menderproduction_mender-gui_1 ... done
Removing menderproduction_mender-elasticsearch_1 ... done
Removing menderproduction_mender-mongo_1 ... done

/ production # ./run up -d
Creating menderproduction_mender-elasticsearch_1 ... error
Creating menderproduction_mender-mongo_1 ...
Creating menderproduction_mender-gui_1 ...
Creating menderproduction_mender-redis_1 ...
Creating menderproduction_minio_1 ...

ERROR: for menderproduction_mender-elasticsearch_1  Cannot start service mender-elasticsearch: OCI runtime create failed: container_linux.go:346: starting container process caused "process_linux.Creating menderproduction_mender-mongo_1 ... done
Creating menderproduction_mender-gui_1 ... done
Creating menderproduction_mender-redis_1 ... done
Creating menderproduction_minio_1 ... done
Creating menderproduction_mender-inventory_1 ... done
Creating menderproduction_mender-create-artifact-worker_1 ... done
Creating menderproduction_mender-workflows-server_1 ... done
Creating menderproduction_mender-useradm_1 ... done
Creating menderproduction_storage-proxy_1 ... done
Creating menderproduction_mender-deployments_1 ... done

ERROR: for mender-elasticsearch  Cannot start service mender-elasticsearch: OCI runtime create failed: container_linux.go:346: starting container process caused "process_linux.go:449: container init caused \"rootfs_linux.go:58: mounting \\\"/home/antoniocotos/deploy/mender-server/conductor/elasticsearch/elasticsearch.yml\\\" to rootfs \\\"/var/lib/docker/overlay2/5c7e341fb9f7411683f2b6422e11f54cd369ae853919cf48ef75ea806454e022/merged\\\" at \\\"/var/lib/docker/overlay2/5c7e341fb9f7411683f2b6422e11f54cd369ae853919cf48ef75ea806454e022/merged/usr/share/elasticsearch/config/elasticsearch.yml\\\" caused \\\"not a directory\\\"\"": unknown: Are you trying to mount a directory onto a file (or vice-versa)? Check if the specified host path exists and is the expected type
ERROR: Encountered errors while bringing up the project.

What could be generating that error? It's not a permissions issue…
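One thing that may be relevant: the merge output above shows conductor/elasticsearch/elasticsearch.yml being removed, so the bind-mount source may simply no longer exist on the host (or Docker may have created an empty directory in its place). A quick way to check, using the path copied from the error message:

ls -ld /home/antoniocotos/deploy/mender-server/conductor/elasticsearch/elasticsearch.yml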

I have solved this by modifying the file:
/production/config/prod.yml:

     mender-elasticsearch:
         volumes:
             - mender-elasticsearch-db:/usr/share/elasticsearch/data:rw
             # - ./conductor/elasticsearch/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml

Now everything is back up, but the version shown in the GUI is still 2.2.0…?

I do not know if it is related to the previous issue, but I have noticed the following:
If I try to upload a .sh file as an artifact with the new system, to put it on a device path, the upload reaches 100% but stops there, and I cannot see it afterwards in the Releases tab to deploy with.

I’m glad it worked out for you :slight_smile:

Could you share your `docker-compose.yml` file after the migration, please?
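You could also double-check which image the GUI container is actually running; something along these lines should show it (assuming the default menderproduction container naming used above):

docker ps --filter name=mender-gui --format '{{.Names}}: {{.Image}}'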

Hi, of course.
Right now everything that was in 2.2.0 works OK, and the version number has been updated in the GUI, but it does not let me upload a .sh file (for example, with the new functionality to generate an artifact); the upload reaches 100% and stays there.

The docker-compose.yml after the update:

mender-server$ cat docker-compose.yml
version: '2.1'
services:

    #
    # mender-deployments
    #
    mender-deployments:
        image: mendersoftware/deployments:1.9.0b1
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        depends_on:
            - mender-mongo
            - storage-proxy

    #
    # mender-gui
    #
    mender-gui:
        image: mendersoftware/gui:2.3.0b1
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        environment:
            - GATEWAY_IP
            - INTEGRATION_VERSION
            - MENDER_ARTIFACT_VERSION
            - MENDER_VERSION
            - MENDER_DEB_PACKAGE_VERSION

    #
    # mender-api-gateway
    #
    mender-api-gateway:
        image: mendersoftware/api-gateway:2.1.0b1
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        # critical - otherwise nginx may not detect
        # these servers and exits with 'upstream server not found'
        depends_on:
            - mender-device-auth
            - mender-deployments
            - mender-gui
            - mender-useradm
            - mender-inventory

    #
    # mender-device-auth
    #
    mender-device-auth:
        image: mendersoftware/deviceauth:2.2.0b1
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        depends_on:
            - mender-mongo
            - mender-conductor

    #
    # mender-inventory
    #
    mender-inventory:
        image: mendersoftware/inventory:1.7.0b1
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        depends_on:
            - mender-mongo

    #
    # mender-useradm
    #
    mender-useradm:
        image: mendersoftware/useradm:1.10.0b1
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        depends_on:
            - mender-mongo

    #
    # mender-workflows
    #
    mender-workflows-server:
        image: mendersoftware/workflows:1.0.0b1
        environment:
            WORKFLOWS_MONGO_URL: mongodb://mender-mongo:27017
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        depends_on:
            - mender-mongo

    #
    # mender-create-artifact-worker
    #
    mender-create-artifact-worker:
        image: mendersoftware/create-artifact-worker:1.0.0b1
        environment:
            WORKFLOWS_MONGO_URL: mongodb://mender-mongo:27017
        extends:
            file: common.yml
            service: mender-base
        environment:
            - WORKFLOWS_MONGO_URL=mongodb://mongo-workflows:27017
            - CREATE_ARTIFACT_GATEWAY_URL=https://mender-api-gateway
            - CREATE_ARTIFACT_DEPLOYMENTS_URL=http://mender-deployments:8080
        networks:
            - mender
        depends_on:
            - mender-mongo

    mender-mongo:
        image: mongo:3.6
        extends:
            file: common.yml
            service: mender-base
        networks:
            mender:
                aliases:
                    - mongo-tenantadm
                    - mongo-deployments
                    - mongo-device-auth
                    - mongo-inventory
                    - mongo-useradm
                    - mongo-workflows

    mender-conductor:
        image: mendersoftware/mender-conductor:1.6.0b1
        depends_on:
            - mender-elasticsearch
            - mender-redis
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender

    mender-elasticsearch:
        image: elasticsearch:6.8.6
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender
        environment:
            - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
            - discovery.type=single-node
            - network.host=0.0.0.0

    mender-redis:
        image: redis:5-alpine3.8
        extends:
            file: common.yml
            service: mender-base
        networks:
            - mender

networks:
    mender:

Thanks!

OK, the problem with the containers was that the entries for the new services, workflows and create-artifact-worker, were missing from docker-compose and prod.yml. I guess something went wrong in the merge with branch 2.3.0b1. Once I corrected them by comparing against the prod.yml files of the original branch on GitHub, I was able to bring the containers up properly.
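For anyone hitting the same thing, an equivalent way to do that comparison locally would be something along these lines (the .template path is how the upstream repository ships prod.yml; adjust if yours differs):

git diff 2.3.0b1 -- docker-compose.yml
git show 2.3.0b1:production/config/prod.yml.template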

Now the problem we detected is the following:

Everything works fine, except creating artifacts from files uploaded through the GUI.

I have these two containers showing as healthy:

1. mender-conductor

attached log.

mender-conductor log

 
Feb 25, 2020 12:37:03 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Root resource classes found:
  class com.netflix.conductor.server.resources.TaskResource
  class io.swagger.jaxrs.listing.ApiListingResource
  class com.netflix.conductor.server.resources.AdminResource
  class com.netflix.conductor.server.resources.WorkflowBulkResource
  class com.netflix.conductor.server.resources.EventResource
  class com.netflix.conductor.server.resources.MetadataResource
  class com.netflix.conductor.server.resources.WorkflowResource
  class com.netflix.conductor.server.resources.HealthCheckResource
  class io.swagger.jaxrs.listing.AcceptHeaderApiListingResource

Feb 25, 2020 12:37:03 AM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Provider classes found:
  class com.netflix.conductor.server.resources.GenericExceptionMapper
  class io.swagger.jaxrs.listing.SwaggerSerializers
  class com.netflix.conductor.server.resources.ApplicationExceptionMapper
  class com.netflix.conductor.server.resources.ValidationExceptionMapper
  class com.netflix.conductor.server.resources.WebAppExceptionMapper

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class

Feb 25, 2020 12:37:03 AM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.19.1 03/11/2016 02:42 PM'

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.netflix.conductor.server.resources.ValidationExceptionMapper to GuiceInstantiatedComponentProvider

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.netflix.conductor.server.resources.ApplicationExceptionMapper to GuiceInstantiatedComponentProvider

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.netflix.conductor.server.resources.GenericExceptionMapper to GuiceInstantiatedComponentProvider

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.netflix.conductor.server.resources.WebAppExceptionMapper to GuiceInstantiatedComponentProvider

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.netflix.conductor.server.resources.EventResource to GuiceInstantiatedComponentProvider

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.netflix.conductor.server.resources.WorkflowResource to GuiceInstantiatedComponentProvider

Feb 25, 2020 12:37:03 AM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding com.netflix.conductor.server.resources.HealthCheckResource to GuiceInstantiatedComponentProvider

Feb 25, 2020 12:37:03 AM com.sun.jersey.spi.inject.Errors processErrorMessages
WARNING: The following warnings have been detected with resource and/or provider classes:
  WARNING: Return type java.util.Map<java.lang.String, ?> of method public java.util.Map<java.lang.String, ?> com.netflix.conductor.server.resources.EventResource.getEventQueues(boolean) is not resolvable to a concrete type

[2020-02-25 00:37:03,736] [main] INFO  org.eclipse.jetty.server.handler.ContextHandler  - Started o.e.j.s.ServletContextHandler@5f08fe00{/,null,AVAILABLE}

[2020-02-25 00:37:03,745] [main] INFO  org.eclipse.jetty.server.AbstractConnector  - Started ServerConnector@622d7e4{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}

[2020-02-25 00:37:03,745] [main] INFO  org.eclipse.jetty.server.Server  - Started @38346ms
Started server on http://localhost:8080/
-- loading task /srv/tasks/create_device_inventory.json
INFO:root:found 1 tasks
INFO:root:loading task create_device_inventory
INFO:root:task create_device_inventory successfully loaded
-- loading task /srv/tasks/delete_device_deployments.json
INFO:root:found 1 tasks
INFO:root:loading task delete_device_deployments
INFO:root:task delete_device_deployments successfully loaded
-- loading task /srv/tasks/delete_device_inventory.json
INFO:root:found 1 tasks
INFO:root:loading task delete_device_inventory
INFO:root:task delete_device_inventory successfully loaded
-- loading task /srv/tasks/send_email.json
INFO:root:found 1 tasks
INFO:root:loading task send_email
INFO:root:task send_email successfully loaded
-- loading workflow /srv/workflows/decommission_device.json
INFO:root:found workflow decommission_device version 4
INFO:root:workflow decommission_device version 4 already present
-- loading workflow /srv/workflows/provision_device.json
INFO:root:found workflow provision_device version 1
INFO:root:workflow provision_device version 1 already present
[2020-02-25 00:38:04,895] [qtp129958347-49] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - A new instance of workflow: provision_device created with id: 5c5df716-55d8-4df4-a0c8-d3c8af30535f
[2020-02-25 00:38:04,982] [qtp129958347-49] INFO  com.netflix.conductor.core.execution.tasks.SystemTaskWorkerCoordinator  - Adding the queue for system task: DECISION
[2020-02-25 00:38:04,983] [qtp129958347-49] INFO  com.netflix.conductor.core.execution.tasks.SystemTaskWorkerCoordinator  - Adding the queue for system task: FORK
[2020-02-25 00:38:04,983] [qtp129958347-49] INFO  com.netflix.conductor.core.execution.tasks.SystemTaskWorkerCoordinator  - Adding the queue for system task: JOIN
[2020-02-25 00:38:04,983] [qtp129958347-49] INFO  com.netflix.conductor.core.execution.tasks.SystemTaskWorkerCoordinator  - Adding the queue for system task: EXCLUSIVE_JOIN
[2020-02-25 00:38:05,549] [system-task-worker-0] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Task: Task{taskType='HTTP', status=SCHEDULED, inputData={http_request={uri=http://mender-inventory:8080/api/internal/v1/inventory/devices, method=POST, contentType=application/json, body={id=27a61be4-6a9a-41d8-a7ad-60bf81420c57, id_data=, IdDataStruct=null, IdDataSha256=null, decommissioning=false, created_ts=0001-01-01T00:00:00Z, updated_ts=0001-01-01T00:00:00Z, auth_sets=null}, headers={X-MEN-RequestID=a846cc02-8808-470f-a292-45c56c389e72, Authorization=Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1ODMxOTUzMDksImp0aSI6IjgyM2VhYWFhLTY1ZDEtNDE3Mi1hMmY5LTY5Y2U1NGI2MTJiYSIsImlzcyI6Im1lbmRlci51c2VyYWRtIiwic3ViIjoiMTM3ZjVhYTItOTMwMi00MjNhLTk1YmItZjdjZDhiZjQ4Mjg1Iiwic2NwIjoibWVuZGVyLioiLCJtZW5kZXIudXNlciI6dHJ1ZX0.RRoFiF6HMNLfh_KNeur9wCO_2eJXLP7XhN2X5517KzhZ1NRJrBFzm8EmiuRSH6f-yO2tB5rhNHFg-Oxd2xWknIrDwZTwbG2GbnuGviaR9LkxDgmxxsNgp792jwJMWzRpToOyELglhgcmNZ9uZjIpucMHmzR7QbHYqC_DvMsdc1MvLq_2eEzQJl_SR3xU-U6Bc4cGNr8R8kVJRzrDfW2y4t42LTAbVK1oIRB-S7uvFI4fNumTwMObJxrNY46LC3kFBtw5qUxm6pCV1GbGFMb8zzAozNO5PuXvrLDMhwIrBu7kpmf_3urBcRdXqTUjU8Ng1hyVPi5Yq3l3lPJnsleazyGac3qy_O_oZqU8215Bf3JQUhZSYQwVskV9ZRpreMvP5DLkeVRX1_tlpJuF6WeklMCEfN28out3uQMjTV_8TwM0FrOAur7y5O81Ad7k1D-syjCP7qmvflnfURQz3cGXc3miw9S_1ZA8tMFWZe5P2X3p7csd5vXOresBeuyRCsqU}, connectionTimeOut=1000, readTimeOut=1000}, asyncComplete=false}, referenceTaskName='create_device_inventory', retryCount=0, seq=1, correlationId='null', pollCount=0, taskDefName='create_device_inventory', scheduledTime=1582591084977, startTime=0, endTime=0, updateTime=1582591085129, startDelayInSeconds=0, retriedTaskId='null', retried=false, executed=false, callbackFromWorker=true, responseTimeoutSeconds=0, workflowInstanceId='5c5df716-55d8-4df4-a0c8-d3c8af30535f', workflowType='provision_device', taskId='a521f08d-0c63-48d2-93ff-444908b0c8a6', reasonForIncompletion='null', callbackAfterSeconds=0, workerId='null', outputData={}, workflowTask=create_device_inventory/create_device_inventory, domain='null', inputMessage='null', outputMessage='null', rateLimitPerFrequency=0, rateLimitFrequencyInSeconds=1, externalInputPayloadStoragePath='null', externalOutputPayloadStoragePath='null'} fetched from execution DAO for taskId: a521f08d-0c63-48d2-93ff-444908b0c8a6
[2020-02-25 00:38:05,567] [system-task-worker-0] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Executing HTTP/a521f08d-0c63-48d2-93ff-444908b0c8a6-SCHEDULED
[2020-02-25 00:38:06,068] [system-task-worker-0] INFO  com.netflix.conductor.contribs.http.HttpTask  - response 201, null
[2020-02-25 00:38:06,192] [system-task-worker-0] INFO  com.netflix.dyno.queues.redis.RedisDynoQueue  - com.netflix.dyno.queues.redis.RedisDynoQueue is ready to serve provision_device:device_accepted
[2020-02-25 00:38:06,300] [system-task-worker-0] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Done Executing HTTP/a521f08d-0c63-48d2-93ff-444908b0c8a6-COMPLETED op={response={headers={X-Men-Requestid=[a846cc02-8808-470f-a292-45c56c389e72], Vary=[Accept-Encoding], Content-Length=[0], Date=[Tue, 25 Feb 2020 00:38:06 GMT], X-Inventory-Version=[unknown], Location=[devices/27a61be4-6a9a-41d8-a7ad-60bf81420c57], Content-Type=[application/json; charset=utf-8]}, reasonPhrase=Created, body=null, statusCode=201}}
[2020-02-25 00:38:10,226] [qtp129958347-49] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - A new instance of workflow: provision_device created with id: f78deac2-21f3-4694-9571-69755a39c4aa
[2020-02-25 00:38:10,568] [system-task-worker-1] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Task: Task{taskType='HTTP', status=SCHEDULED, inputData={http_request={uri=http://mender-inventory:8080/api/internal/v1/inventory/devices, method=POST, contentType=application/json, body={id=4fcd8b3d-9863-41d4-a923-537c777f68ec, id_data=, IdDataStruct=null, IdDataSha256=null, decommissioning=false, created_ts=0001-01-01T00:00:00Z, updated_ts=0001-01-01T00:00:00Z, auth_sets=null}, headers={X-MEN-RequestID=9f656c4b-cc16-477f-9e60-cfc16636c325, Authorization=Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1ODMxOTUzMDksImp0aSI6IjgyM2VhYWFhLTY1ZDEtNDE3Mi1hMmY5LTY5Y2U1NGI2MTJiYSIsImlzcyI6Im1lbmRlci51c2VyYWRtIiwic3ViIjoiMTM3ZjVhYTItOTMwMi00MjNhLTk1YmItZjdjZDhiZjQ4Mjg1Iiwic2NwIjoibWVuZGVyLioiLCJtZW5kZXIudXNlciI6dHJ1ZX0.RRoFiF6HMNLfh_KNeur9wCO_2eJXLP7XhN2X5517KzhZ1NRJrBFzm8EmiuRSH6f-yO2tB5rhNHFg-Oxd2xWknIrDwZTwbG2GbnuGviaR9LkxDgmxxsNgp792jwJMWzRpToOyELglhgcmNZ9uZjIpucMHmzR7QbHYqC_DvMsdc1MvLq_2eEzQJl_SR3xU-U6Bc4cGNr8R8kVJRzrDfW2y4t42LTAbVK1oIRB-S7uvFI4fNumTwMObJxrNY46LC3kFBtw5qUxm6pCV1GbGFMb8zzAozNO5PuXvrLDMhwIrBu7kpmf_3urBcRdXqTUjU8Ng1hyVPi5Yq3l3lPJnsleazyGac3qy_O_oZqU8215Bf3JQUhZSYQwVskV9ZRpreMvP5DLkeVRX1_tlpJuF6WeklMCEfN28out3uQMjTV_8TwM0FrOAur7y5O81Ad7k1D-syjCP7qmvflnfURQz3cGXc3miw9S_1ZA8tMFWZe5P2X3p7csd5vXOresBeuyRCsqU}, connectionTimeOut=1000, readTimeOut=1000}, asyncComplete=false}, referenceTaskName='create_device_inventory', retryCount=0, seq=1, correlationId='null', pollCount=0, taskDefName='create_device_inventory', scheduledTime=1582591090230, startTime=0, endTime=0, updateTime=1582591090293, startDelayInSeconds=0, retriedTaskId='null', retried=false, executed=false, callbackFromWorker=true, responseTimeoutSeconds=0, workflowInstanceId='f78deac2-21f3-4694-9571-69755a39c4aa', workflowType='provision_device', taskId='1f5f58f1-2343-4d61-943c-72324389a22b', reasonForIncompletion='null', callbackAfterSeconds=0, workerId='null', outputData={}, workflowTask=create_device_inventory/create_device_inventory, domain='null', inputMessage='null', outputMessage='null', rateLimitPerFrequency=0, rateLimitFrequencyInSeconds=1, externalInputPayloadStoragePath='null', externalOutputPayloadStoragePath='null'} fetched from execution DAO for taskId: 1f5f58f1-2343-4d61-943c-72324389a22b
[2020-02-25 00:38:10,570] [system-task-worker-1] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Executing HTTP/1f5f58f1-2343-4d61-943c-72324389a22b-SCHEDULED
[2020-02-25 00:38:10,725] [system-task-worker-1] INFO  com.netflix.conductor.contribs.http.HttpTask  - response 201, null
[2020-02-25 00:38:10,898] [system-task-worker-1] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Done Executing HTTP/1f5f58f1-2343-4d61-943c-72324389a22b-COMPLETED op={response={headers={X-Men-Requestid=[9f656c4b-cc16-477f-9e60-cfc16636c325], Vary=[Accept-Encoding], Content-Length=[0], Date=[Tue, 25 Feb 2020 00:38:10 GMT], X-Inventory-Version=[unknown], Location=[devices/4fcd8b3d-9863-41d4-a923-537c777f68ec], Content-Type=[application/json; charset=utf-8]}, reasonPhrase=Created, body=null, statusCode=201}}
[2020-02-25 00:38:15,057] [qtp129958347-45] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - A new instance of workflow: provision_device created with id: a40c8098-b2c4-4b3b-a867-58f77ff8b6e0
[2020-02-25 00:38:15,338] [system-task-worker-2] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Task: Task{taskType='HTTP', status=SCHEDULED, inputData={http_request={uri=http://mender-inventory:8080/api/internal/v1/inventory/devices, method=POST, contentType=application/json, body={id=e49198d4-9285-435e-b306-e651dd36910f, id_data=, IdDataStruct=null, IdDataSha256=null, decommissioning=false, created_ts=0001-01-01T00:00:00Z, updated_ts=0001-01-01T00:00:00Z, auth_sets=null}, headers={X-MEN-RequestID=3557bc4f-e508-41b5-8dcc-4ca2961c2764, Authorization=Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1ODMxOTUzMDksImp0aSI6IjgyM2VhYWFhLTY1ZDEtNDE3Mi1hMmY5LTY5Y2U1NGI2MTJiYSIsImlzcyI6Im1lbmRlci51c2VyYWRtIiwic3ViIjoiMTM3ZjVhYTItOTMwMi00MjNhLTk1YmItZjdjZDhiZjQ4Mjg1Iiwic2NwIjoibWVuZGVyLioiLCJtZW5kZXIudXNlciI6dHJ1ZX0.RRoFiF6HMNLfh_KNeur9wCO_2eJXLP7XhN2X5517KzhZ1NRJrBFzm8EmiuRSH6f-yO2tB5rhNHFg-Oxd2xWknIrDwZTwbG2GbnuGviaR9LkxDgmxxsNgp792jwJMWzRpToOyELglhgcmNZ9uZjIpucMHmzR7QbHYqC_DvMsdc1MvLq_2eEzQJl_SR3xU-U6Bc4cGNr8R8kVJRzrDfW2y4t42LTAbVK1oIRB-S7uvFI4fNumTwMObJxrNY46LC3kFBtw5qUxm6pCV1GbGFMb8zzAozNO5PuXvrLDMhwIrBu7kpmf_3urBcRdXqTUjU8Ng1hyVPi5Yq3l3lPJnsleazyGac3qy_O_oZqU8215Bf3JQUhZSYQwVskV9ZRpreMvP5DLkeVRX1_tlpJuF6WeklMCEfN28out3uQMjTV_8TwM0FrOAur7y5O81Ad7k1D-syjCP7qmvflnfURQz3cGXc3miw9S_1ZA8tMFWZe5P2X3p7csd5vXOresBeuyRCsqU}, connectionTimeOut=1000, readTimeOut=1000}, asyncComplete=false}, referenceTaskName='create_device_inventory', retryCount=0, seq=1, correlationId='null', pollCount=0, taskDefName='create_device_inventory', scheduledTime=1582591095062, startTime=0, endTime=0, updateTime=1582591095111, startDelayInSeconds=0, retriedTaskId='null', retried=false, executed=false, callbackFromWorker=true, responseTimeoutSeconds=0, workflowInstanceId='a40c8098-b2c4-4b3b-a867-58f77ff8b6e0', workflowType='provision_device', taskId='9c9d42b7-e0cf-428c-a54a-7e24590b2016', reasonForIncompletion='null', callbackAfterSeconds=0, workerId='null', outputData={}, workflowTask=create_device_inventory/create_device_inventory, domain='null', inputMessage='null', outputMessage='null', rateLimitPerFrequency=0, rateLimitFrequencyInSeconds=1, externalInputPayloadStoragePath='null', externalOutputPayloadStoragePath='null'} fetched from execution DAO for taskId: 9c9d42b7-e0cf-428c-a54a-7e24590b2016
[2020-02-25 00:38:15,352] [system-task-worker-2] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Executing HTTP/9c9d42b7-e0cf-428c-a54a-7e24590b2016-SCHEDULED
[2020-02-25 00:38:15,498] [system-task-worker-2] INFO  com.netflix.conductor.contribs.http.HttpTask  - response 201, null
[2020-02-25 00:38:15,687] [system-task-worker-2] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Done Executing HTTP/9c9d42b7-e0cf-428c-a54a-7e24590b2016-COMPLETED op={response={headers={X-Men-Requestid=[3557bc4f-e508-41b5-8dcc-4ca2961c2764], Vary=[Accept-Encoding], Content-Length=[0], Date=[Tue, 25 Feb 2020 00:38:15 GMT], X-Inventory-Version=[unknown], Location=[devices/e49198d4-9285-435e-b306-e651dd36910f], Content-Type=[application/json; charset=utf-8]}, reasonPhrase=Created, body=null, statusCode=201}}
[2020-02-25 00:44:33,631] [qtp129958347-45] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - A new instance of workflow: provision_device created with id: 68ac03c3-c18c-4123-9875-dbcc69d03dcb
[2020-02-25 00:44:34,046] [system-task-worker-3] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Task: Task{taskType='HTTP', status=SCHEDULED, inputData={http_request={uri=http://mender-inventory:8080/api/internal/v1/inventory/devices, method=POST, contentType=application/json, body={id=275ba018-01a8-447d-8a76-f9cb6f306dd4, id_data=, IdDataStruct=null, IdDataSha256=null, decommissioning=false, created_ts=0001-01-01T00:00:00Z, updated_ts=0001-01-01T00:00:00Z, auth_sets=null}, headers={X-MEN-RequestID=22c889dd-3bc4-4d39-84cd-2225407a4a21, Authorization=Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1ODMxOTUzMDksImp0aSI6IjgyM2VhYWFhLTY1ZDEtNDE3Mi1hMmY5LTY5Y2U1NGI2MTJiYSIsImlzcyI6Im1lbmRlci51c2VyYWRtIiwic3ViIjoiMTM3ZjVhYTItOTMwMi00MjNhLTk1YmItZjdjZDhiZjQ4Mjg1Iiwic2NwIjoibWVuZGVyLioiLCJtZW5kZXIudXNlciI6dHJ1ZX0.RRoFiF6HMNLfh_KNeur9wCO_2eJXLP7XhN2X5517KzhZ1NRJrBFzm8EmiuRSH6f-yO2tB5rhNHFg-Oxd2xWknIrDwZTwbG2GbnuGviaR9LkxDgmxxsNgp792jwJMWzRpToOyELglhgcmNZ9uZjIpucMHmzR7QbHYqC_DvMsdc1MvLq_2eEzQJl_SR3xU-U6Bc4cGNr8R8kVJRzrDfW2y4t42LTAbVK1oIRB-S7uvFI4fNumTwMObJxrNY46LC3kFBtw5qUxm6pCV1GbGFMb8zzAozNO5PuXvrLDMhwIrBu7kpmf_3urBcRdXqTUjU8Ng1hyVPi5Yq3l3lPJnsleazyGac3qy_O_oZqU8215Bf3JQUhZSYQwVskV9ZRpreMvP5DLkeVRX1_tlpJuF6WeklMCEfN28out3uQMjTV_8TwM0FrOAur7y5O81Ad7k1D-syjCP7qmvflnfURQz3cGXc3miw9S_1ZA8tMFWZe5P2X3p7csd5vXOresBeuyRCsqU}, connectionTimeOut=1000, readTimeOut=1000}, asyncComplete=false}, referenceTaskName='create_device_inventory', retryCount=0, seq=1, correlationId='null', pollCount=0, taskDefName='create_device_inventory', scheduledTime=1582591473635, startTime=0, endTime=0, updateTime=1582591473690, startDelayInSeconds=0, retriedTaskId='null', retried=false, executed=false, callbackFromWorker=true, responseTimeoutSeconds=0, workflowInstanceId='68ac03c3-c18c-4123-9875-dbcc69d03dcb', workflowType='provision_device', taskId='d22edae1-5b60-4c33-b194-e2ade322e033', reasonForIncompletion='null', callbackAfterSeconds=0, workerId='null', outputData={}, workflowTask=create_device_inventory/create_device_inventory, domain='null', inputMessage='null', outputMessage='null', rateLimitPerFrequency=0, rateLimitFrequencyInSeconds=1, externalInputPayloadStoragePath='null', externalOutputPayloadStoragePath='null'} fetched from execution DAO for taskId: d22edae1-5b60-4c33-b194-e2ade322e033
[2020-02-25 00:44:34,048] [system-task-worker-3] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Executing HTTP/d22edae1-5b60-4c33-b194-e2ade322e033-SCHEDULED
[2020-02-25 00:44:34,204] [system-task-worker-3] INFO  com.netflix.conductor.contribs.http.HttpTask  - response 201, null
[2020-02-25 00:44:34,309] [system-task-worker-3] INFO  com.netflix.conductor.core.execution.WorkflowExecutor  - Done Executing HTTP/d22edae1-5b60-4c33-b194-e2ade322e033-COMPLETED op={response={headers={X-Men-Requestid=[22c889dd-3bc4-4d39-84cd-2225407a4a21], Vary=[Accept-Encoding], Content-Length=[0], Date=[Tue, 25 Feb 2020 00:44:34 GMT], X-Inventory-Version=[unknown], Location=[devices/275ba018-01a8-447d-8a76-f9cb6f306dd4], Content-Type=[application/json; charset=utf-8]}, reasonPhrase=Created, body=null, statusCode=201}}

2. minio

attached log.

minio log:
 Host: 0.0.0.0:9000
Accept: */*
User-Agent: curl/7.61.1

[RESPONSE] [158272794.101153] [2020-02-26 14:39:01 +0000]
200 OK
Accept-Ranges: bytes
Vary: Origin
X-Xss-Protection: 1; mode=block
Content-Security-Policy: block-all-mixed-content
X-Amz-Request-Id: 15F6FAAD3C1345BC
Server: Minio/RELEASE.2018-09-25T21-34-43Z (linux; amd64)

[REQUEST LivenessCheckHandler] [158272794.101616] [2020-02-26 14:39:01 +0000]
GET /minio/health/live
Host: 0.0.0.0:9000
User-Agent: curl/7.61.1
Accept: */*

[RESPONSE] [158272794.101616] [2020-02-26 14:39:01 +0000]
200 OK
Content-Security-Policy: block-all-mixed-content
X-Amz-Request-Id: 15F6FAAD3C5A2018
Server: Minio/RELEASE.2018-09-25T21-34-43Z (linux; amd64)
Accept-Ranges: bytes
Vary: Origin
X-Xss-Protection: 1; mode=block

[REQUEST LivenessCheckHandler] [158272797.151708] [2020-02-26 14:39:31 +0000]
GET /minio/health/live
Host: 0.0.0.0:9000
User-Agent: curl/7.61.1
Accept: */*

[RESPONSE] [158272797.151708] [2020-02-26 14:39:31 +0000]
200 OK
Server: Minio/RELEASE.2018-09-25T21-34-43Z (linux; amd64)
Accept-Ranges: bytes
Vary: Origin
X-Xss-Protection: 1; mode=block
Content-Security-Policy: block-all-mixed-content
X-Amz-Request-Id: 15F6FAB45659232E

[REQUEST LivenessCheckHandler] [158272797.152164] [2020-02-26 14:39:31 +0000]
GET /minio/health/live
Host: 0.0.0.0:9000
User-Agent: curl/7.61.1
Accept: */*

[RESPONSE] [158272797.152164] [2020-02-26 14:39:31 +0000]
200 OK
Vary: Origin
X-Xss-Protection: 1; mode=block
Content-Security-Policy: block-all-mixed-content
X-Amz-Request-Id: 15F6FAB4569F19F1
Server: Minio/RELEASE.2018-09-25T21-34-43Z (linux; amd64)
Accept-Ranges: bytes

[REQUEST LivenessCheckHandler] [158272800.194524] [2020-02-26 14:40:01 +0000]
GET /minio/health/live
Host: 0.0.0.0:9000
User-Agent: curl/7.61.1
Accept: */*

[RESPONSE] [158272800.194524] [2020-02-26 14:40:01 +0000]
200 OK
Vary: Origin
X-Xss-Protection: 1; mode=block
Content-Security-Policy: block-all-mixed-content
X-Amz-Request-Id: 15F6FABB6C01CD8C
Server: Minio/RELEASE.2018-09-25T21-34-43Z (linux; amd64)
Accept-Ranges: bytes

[REQUEST LivenessCheckHandler] [158272800.195003] [2020-02-26 14:40:01 +0000]
GET /minio/health/live
Host: 0.0.0.0:9000
User-Agent: curl/7.61.1
Accept: */*

I can create some artifacts (I don't know why some work and others don't…).
In some cases everything is created correctly: I see the file inside the minio storage, I can see it under Releases, I can deploy with it, and everything deploys fine on the device.

In other cases the upload reaches 100% but stays there; the artifact is simply not created, and I don't see it in minio either.

I have tried different types of files, ZIP, SH, PDF, IMG, etc.
I hope you can help me find the solution.

Thank you!

It looks like something has gone wrong during the update.

My recommendation is to set your current work aside, check out a fresh 2.3.0 branch, and see if that works.

If it does, start by adding your own YAML override files: https://docs.docker.com/compose/extends/

This should give a clearer indication, and give us a clean base to debug from, I think :slight_smile:
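For reference, override files are just extra -f arguments layered on top of the base compose file. A rough sketch of the idea (my-overrides.yml is only a placeholder name; as far as I know the ./run script already layers config/prod.yml in much the same way):

docker-compose -f docker-compose.yml -f production/config/prod.yml -f my-overrides.yml up -d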

Hi
I have tried bringing up a server from scratch on branch 2.3.0b1, and I have the same problem as the one I reported above.

I use Ubuntu 18.04 with 16 GB of RAM, on VMware.

The mender-conductor and minio containers still show as healthy, and I get the same problem when uploading files for a release: some uploads finish fine and others do not.

I await your suggestions :slight_smile:

@blackdevice can you please provide us with the logs from all the containers (if possible), or in particular from create-artifact-worker and workflows-server?
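Assuming ./run forwards its arguments to docker-compose (as with the pull/up commands above), something like this from the production directory should capture everything in one go:

./run logs --no-color > mender-logs.txt
./run logs --no-color mender-create-artifact-worker mender-workflows-server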

Hi,

Of course, thank you very much for the help!

Here you can download a zip with the container logs:
http://cloud.interactionfactory.es/index.php/s/Y381QdRc9kO2tAE

Thank you!

@blackdevice the logs don't seem to match up, I believe; the create-artifact-worker logs show two artifact generation requests:

    1. 2020-02-27T10:36:46Z
    2. 2020-02-27T10:59:27Z

The logs from the other containers (I’m interested in api-gateway and deployments) are from the 2nd of March, though.

Sorry,
I have captured the logs again over a longer period of time, to see if it helps you spot the problem.
Thank you!

http://cloud.interactionfactory.es/index.php/s/QgwpmeHF5L2U5K3

@blackdevice as far as I can see from the logs, the first generation request completed successfully and the resulting artifact was uploaded to the deployments service. The second request completed as well, but I cannot see any upload to the deployments service.

May I ask you to check the “workflows” mongo database, and the “jobs” collection in particular? It contains the results of the asynchronous job executions, including their actual outputs.
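A rough sketch of how to inspect it from the host, using the mongo shell inside the container (the container name assumes the menderproduction naming shown earlier in the thread):

docker exec -it menderproduction_mender-mongo_1 mongo workflows --eval 'db.jobs.find().forEach(printjson)'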